[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
30583 1726853664.39483: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Qi7
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
30583 1726853664.39776: Added group all to inventory
30583 1726853664.39778: Added group ungrouped to inventory
30583 1726853664.39780: Group all now contains ungrouped
30583 1726853664.39783: Examining possible inventory source: /tmp/network-iHm/inventory.yml
30583 1726853664.51260: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
30583 1726853664.51333: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
30583 1726853664.51350: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
30583 1726853664.51417: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
30583 1726853664.51467: Loaded config def from plugin (inventory/script)
30583 1726853664.51469: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
30583 1726853664.51498: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
30583 1726853664.51554: Loaded config def from plugin (inventory/yaml)
30583 1726853664.51556: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
30583 1726853664.51620: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
30583 1726853664.51897: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
30583 1726853664.51900: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
30583 1726853664.51902: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
30583 1726853664.51906: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
30583 1726853664.51910: Loading data from /tmp/network-iHm/inventory.yml
30583 1726853664.51952: /tmp/network-iHm/inventory.yml was not parsable by auto
30583 1726853664.51996: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
30583 1726853664.52024: Loading data from /tmp/network-iHm/inventory.yml
30583 1726853664.52079: group all already in inventory
30583 1726853664.52084: set inventory_file for managed_node1
30583 1726853664.52087: set inventory_dir for managed_node1
30583 1726853664.52088: Added host managed_node1 to inventory
30583 1726853664.52090: Added host managed_node1 to group all
30583 1726853664.52090: set ansible_host for managed_node1
30583 1726853664.52091: set ansible_ssh_extra_args for managed_node1
30583 1726853664.52093: set inventory_file for managed_node2
30583 1726853664.52095: set inventory_dir for managed_node2
30583 1726853664.52095: Added host managed_node2 to inventory
30583 1726853664.52096: Added host managed_node2 to group all
30583 1726853664.52097: set ansible_host for managed_node2
30583 1726853664.52097: set ansible_ssh_extra_args for managed_node2
30583 1726853664.52099: set inventory_file for managed_node3
30583 1726853664.52100: set inventory_dir for managed_node3
30583 1726853664.52100: Added host managed_node3 to inventory
30583 1726853664.52101: Added host managed_node3 to group all
30583 1726853664.52102: set ansible_host for managed_node3
30583 1726853664.52102: set ansible_ssh_extra_args for managed_node3
30583 1726853664.52104: Reconcile groups and hosts in inventory.
30583 1726853664.52106: Group ungrouped now contains managed_node1
30583 1726853664.52107: Group ungrouped now contains managed_node2
30583 1726853664.52108: Group ungrouped now contains managed_node3
30583 1726853664.52159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
30583 1726853664.52247: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
30583 1726853664.52280: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
30583 1726853664.52298: Loaded config def from plugin (vars/host_group_vars)
30583 1726853664.52299: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
30583 1726853664.52304: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
30583 1726853664.52309: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
30583 1726853664.52336: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
30583 1726853664.52576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853664.52652: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
30583 1726853664.52679: Loaded config def from plugin (connection/local)
30583 1726853664.52681: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
30583 1726853664.53073: Loaded config def from plugin (connection/paramiko_ssh)
30583 1726853664.53076: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
30583 1726853664.53909: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
30583 1726853664.53948: Loaded config def from plugin (connection/psrp)
30583 1726853664.53951: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
30583 1726853664.54626: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
30583 1726853664.54661: Loaded config def from plugin (connection/ssh)
30583 1726853664.54664: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
30583 1726853664.56465: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
30583 1726853664.56503: Loaded config def from plugin (connection/winrm)
30583 1726853664.56506: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
30583 1726853664.56534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
30583 1726853664.56612: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
30583 1726853664.56679: Loaded config def from plugin (shell/cmd)
30583 1726853664.56680: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
30583 1726853664.56704: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
30583 1726853664.56760: Loaded config def from plugin (shell/powershell)
30583 1726853664.56761: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
30583 1726853664.56809: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
30583 1726853664.56974: Loaded config def from plugin (shell/sh)
30583 1726853664.56977: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
30583 1726853664.57007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
30583 1726853664.57117: Loaded config def from plugin (become/runas)
30583 1726853664.57119: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
30583 1726853664.57290: Loaded config def from plugin (become/su)
30583 1726853664.57292: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
30583 1726853664.57437: Loaded config def from plugin (become/sudo)
30583 1726853664.57439: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
30583 1726853664.57469: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml
30583 1726853664.57764: in VariableManager get_vars()
30583 1726853664.57790: done with get_vars()
30583 1726853664.57910: trying /usr/local/lib/python3.12/site-packages/ansible/modules
30583 1726853664.62266: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
30583 1726853664.62404: in VariableManager get_vars()
30583 1726853664.62410: done with get_vars()
30583 1726853664.62418: variable 'playbook_dir' from source: magic vars
30583 1726853664.62419: variable 'ansible_playbook_python' from source: magic vars
30583 1726853664.62420: variable 'ansible_config_file' from source: magic vars
30583 1726853664.62420: variable 'groups' from source: magic vars
30583 1726853664.62421: variable 'omit' from source: magic vars
30583 1726853664.62422: variable 'ansible_version' from source: magic vars
30583 1726853664.62422: variable 'ansible_check_mode' from source: magic vars
30583 1726853664.62423: variable 'ansible_diff_mode' from source: magic vars
30583 1726853664.62424: variable 'ansible_forks' from source: magic vars
30583 1726853664.62425: variable 'ansible_inventory_sources' from source: magic vars
30583 1726853664.62425: variable 'ansible_skip_tags' from source: magic vars
30583 1726853664.62426: variable 'ansible_limit' from source: magic vars
30583 1726853664.62427: variable 'ansible_run_tags' from source: magic vars
30583 1726853664.62427: variable 'ansible_verbosity' from source: magic vars
30583 1726853664.62467: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml
30583 1726853664.63175: in VariableManager get_vars()
30583 1726853664.63193: done with get_vars()
30583 1726853664.63242: in VariableManager get_vars()
30583 1726853664.63256: done with get_vars()
30583 1726853664.63315: in VariableManager get_vars()
30583 1726853664.63329: done with get_vars()
30583 1726853664.63381: in VariableManager get_vars()
30583 1726853664.63400: done with get_vars()
30583 1726853664.63451: in VariableManager get_vars()
30583 1726853664.63468: done with get_vars()
30583 1726853664.63524: in VariableManager get_vars()
30583 1726853664.63538: done with get_vars()
30583 1726853664.63597: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
30583 1726853664.63617: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
30583 1726853664.63857: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
30583 1726853664.64085: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
30583 1726853664.64088: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Qi7/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
30583 1726853664.64118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
30583 1726853664.64142: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
30583 1726853664.64382: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
30583 1726853664.64696: Loaded config def from plugin (callback/default)
30583 1726853664.64698: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
30583 1726853664.65947: Loaded config def from plugin (callback/junit)
30583 1726853664.65950: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
30583 1726853664.65998: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
30583 1726853664.66066: Loaded config def from plugin (callback/minimal)
30583 1726853664.66068: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
30583 1726853664.66148: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
30583 1726853664.66229: Loaded config def from plugin (callback/tree)
30583 1726853664.66231: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
30583 1726853664.66348: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
30583 1726853664.66351: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Qi7/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
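The one actionable item in this trace so far is the deprecation warning at the top of the run: the plural ANSIBLE_COLLECTIONS_PATHS is slated for removal in ansible-core 2.19. A minimal sketch of the fix, assuming a POSIX shell and that the collections really do live under /tmp/collections-Qi7 as this run's version banner reports (the run above used no config file, so the ansible.cfg written here is illustrative):

```shell
# Switch to the singular variable name; the plural form is deprecated
# and will be removed in ansible-core 2.19.
export ANSIBLE_COLLECTIONS_PATH=/tmp/collections-Qi7
unset ANSIBLE_COLLECTIONS_PATHS 2>/dev/null || true

# Alternatively (or additionally), deprecation warnings can be silenced
# via ansible.cfg, as the warning text itself suggests.
cat >> ansible.cfg <<'EOF'
[defaults]
deprecation_warnings = False
EOF
```

Either change on its own makes the warning disappear; renaming the variable is the forward-compatible option, while the ansible.cfg setting merely hides it.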

PLAYBOOK: tests_states_nm.yml **************************************************
2 plays in /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml
30583 1726853664.66609: in VariableManager get_vars()
30583 1726853664.66623: done with get_vars()
30583 1726853664.66629: in VariableManager get_vars()
30583 1726853664.66638: done with get_vars()
30583 1726853664.66642: variable 'omit' from source: magic vars
30583 1726853664.66682: in VariableManager get_vars()
30583 1726853664.66696: done with get_vars()
30583 1726853664.66723: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_states.yml' with nm as provider] ***********
30583 1726853664.68962: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
30583 1726853664.69031: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
30583 1726853664.69058: getting the remaining hosts for this loop
30583 1726853664.69059: done getting the remaining hosts for this loop
30583 1726853664.69062: getting the next task for host managed_node2
30583 1726853664.69065: done getting next task for host managed_node2
30583 1726853664.69066:  ^ task is: TASK: Gathering Facts
30583 1726853664.69068:  ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853664.69070: getting variables
30583 1726853664.69072: in VariableManager get_vars()
30583 1726853664.69081: Calling all_inventory to load vars for managed_node2
30583 1726853664.69084: Calling groups_inventory to load vars for managed_node2
30583 1726853664.69086: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853664.69096: Calling all_plugins_play to load vars for managed_node2
30583 1726853664.69110: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853664.69113: Calling groups_plugins_play to load vars for managed_node2
30583 1726853664.69141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853664.69197: done with get_vars()
30583 1726853664.69204: done getting variables
30583 1726853664.69282: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:6
Friday 20 September 2024 13:34:24 -0400 (0:00:00.030) 0:00:00.030 ******
30583 1726853664.69304: entering _queue_task() for managed_node2/gather_facts
30583 1726853664.69306: Creating lock for gather_facts
30583 1726853664.69684: worker is 1 (out of 1 available)
30583 1726853664.69700: exiting _queue_task() for managed_node2/gather_facts
30583 1726853664.69715: done queuing things up, now waiting for results queue to drain
30583 1726853664.69717: waiting for pending results...
30583 1726853664.70002: running TaskExecutor() for managed_node2/TASK: Gathering Facts
30583 1726853664.70010: in run() - task 02083763-bbaf-05ea-abc5-00000000001b
30583 1726853664.70100: variable 'ansible_search_path' from source: unknown
30583 1726853664.70428: calling self._execute()
30583 1726853664.70780: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853664.70784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853664.70787: variable 'omit' from source: magic vars
30583 1726853664.70835: variable 'omit' from source: magic vars
30583 1726853664.70870: variable 'omit' from source: magic vars
30583 1726853664.71021: variable 'omit' from source: magic vars
30583 1726853664.71070: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30583 1726853664.71141: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30583 1726853664.71236: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30583 1726853664.71268: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30583 1726853664.71288: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30583 1726853664.71354: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30583 1726853664.71441: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853664.71450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853664.71682: Set connection var ansible_module_compression to ZIP_DEFLATED
30583 1726853664.71694: Set connection var ansible_timeout to 10
30583 1726853664.71701: Set connection var ansible_connection to ssh
30583 1726853664.71711: Set connection var ansible_shell_executable to /bin/sh
30583 1726853664.71717: Set connection var ansible_shell_type to sh
30583 1726853664.71730: Set connection var ansible_pipelining to False
30583 1726853664.71979: variable 'ansible_shell_executable' from source: unknown
30583 1726853664.71982: variable 'ansible_connection' from source: unknown
30583 1726853664.71984: variable 'ansible_module_compression' from source: unknown
30583 1726853664.71986: variable 'ansible_shell_type' from source: unknown
30583 1726853664.71988: variable 'ansible_shell_executable' from source: unknown
30583 1726853664.71990: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853664.71992: variable 'ansible_pipelining' from source: unknown
30583 1726853664.71993: variable 'ansible_timeout' from source: unknown
30583 1726853664.71995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853664.72180: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=True, class_only=False)
30583 1726853664.72315: variable 'omit' from source: magic vars
30583 1726853664.72328: starting attempt loop
30583 1726853664.72335: running the handler
30583 1726853664.72352: variable 'ansible_facts' from source: unknown
30583 1726853664.72378: _low_level_execute_command(): starting
30583 1726853664.72389: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
30583 1726853664.73191: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853664.73267: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<<
30583 1726853664.73297: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30583 1726853664.73332: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30583 1726853664.73577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30583 1726853664.75272: stdout chunk (state=3): >>>/root <<<
30583 1726853664.75365: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30583 1726853664.75714: stderr chunk (state=3): >>><<<
30583 1726853664.75718: stdout chunk (state=3): >>><<<
30583 1726853664.75721: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30583 1726853664.75724: _low_level_execute_command(): starting
30583 1726853664.75726: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853664.7564113-30602-205863084595615 `" && echo ansible-tmp-1726853664.7564113-30602-205863084595615="` echo /root/.ansible/tmp/ansible-tmp-1726853664.7564113-30602-205863084595615 `" ) && sleep 0'
30583 1726853664.76891: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<<
30583 1726853664.77030: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
30583 1726853664.77127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30583 1726853664.79294: stdout chunk (state=3): >>>ansible-tmp-1726853664.7564113-30602-205863084595615=/root/.ansible/tmp/ansible-tmp-1726853664.7564113-30602-205863084595615 <<<
30583 1726853664.79619: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30583 1726853664.79778: stderr chunk (state=3): >>><<<
30583 1726853664.79795: stdout chunk (state=3): >>><<<
30583 1726853664.79875: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853664.7564113-30602-205863084595615=/root/.ansible/tmp/ansible-tmp-1726853664.7564113-30602-205863084595615 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30583 1726853664.79879: variable 'ansible_module_compression' from source: unknown
30583 1726853664.80007: ANSIBALLZ: Using generic lock for ansible.legacy.setup
30583 1726853664.80083: ANSIBALLZ: Acquiring lock
30583 1726853664.80096: ANSIBALLZ: Lock acquired: 139827455545936
30583 1726853664.80109: ANSIBALLZ: Creating module
30583 1726853665.36258: ANSIBALLZ: Writing module into payload
30583 1726853665.36477: ANSIBALLZ: Writing module
30583 1726853665.36480: ANSIBALLZ: Renaming module
30583 1726853665.36482: ANSIBALLZ: Done creating module
30583 1726853665.36525: variable 'ansible_facts' from source: unknown
30583 1726853665.36543: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30583 1726853665.36603: _low_level_execute_command(): starting
30583 1726853665.36607: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0'
30583 1726853665.37177: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
30583 1726853665.37206: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30583 1726853665.37282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853665.37423: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<<
30583 1726853665.37426: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30583 1726853665.37428: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30583 1726853665.37590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30583 1726853665.39389: stdout chunk (state=3): >>>PLATFORM <<<
30583 1726853665.39392: stdout chunk (state=3): >>>Linux <<<
30583 1726853665.39438: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<<
30583 1726853665.39579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30583 1726853665.39696: stderr chunk (state=3): >>><<<
30583 1726853665.39699: stdout chunk (state=3): >>><<<
30583 1726853665.39725: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2:
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853665.39730 [managed_node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 30583 1726853665.39803: _low_level_execute_command(): starting 30583 1726853665.39807: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 30583 1726853665.40177: Sending initial data 30583 1726853665.40182: Sent initial data (1181 bytes) 30583 1726853665.40876: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853665.40880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853665.40883: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853665.40885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853665.40887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853665.41036: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853665.41068: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853665.41189: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853665.41193: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853665.44756: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 30583 1726853665.45407: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853665.45676: stderr chunk (state=3): >>><<< 30583 1726853665.45680: stdout chunk (state=3): >>><<< 30583 1726853665.45683: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853665.45685: variable 'ansible_facts' from source: unknown 30583 1726853665.45687: variable 'ansible_facts' from source: unknown 30583 1726853665.45690: variable 'ansible_module_compression' from source: unknown 30583 1726853665.45692: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 30583 1726853665.45694: variable 'ansible_facts' from source: unknown 30583 1726853665.46163: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853664.7564113-30602-205863084595615/AnsiballZ_setup.py 30583 1726853665.46704: Sending initial data 30583 1726853665.46770: Sent initial data (154 bytes) 30583 1726853665.47854: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853665.47870: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853665.47932: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853665.48000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853665.48018: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853665.48040: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853665.48150: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853665.49946: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports 
extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853665.50021: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853665.50109: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpp0czz19a /root/.ansible/tmp/ansible-tmp-1726853664.7564113-30602-205863084595615/AnsiballZ_setup.py <<< 30583 1726853665.50118: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853664.7564113-30602-205863084595615/AnsiballZ_setup.py" <<< 30583 1726853665.50197: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpp0czz19a" to remote "/root/.ansible/tmp/ansible-tmp-1726853664.7564113-30602-205863084595615/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853664.7564113-30602-205863084595615/AnsiballZ_setup.py" <<< 30583 1726853665.52418: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853665.52583: stderr chunk (state=3): >>><<< 30583 1726853665.52587: stdout chunk (state=3): >>><<< 30583 1726853665.52589: done transferring module to remote 30583 1726853665.52591: _low_level_execute_command(): starting 30583 1726853665.52593: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853664.7564113-30602-205863084595615/ /root/.ansible/tmp/ansible-tmp-1726853664.7564113-30602-205863084595615/AnsiballZ_setup.py && sleep 0' 30583 1726853665.53229: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853665.53305: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853665.53363: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853665.53446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853665.55418: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853665.55422: stdout chunk (state=3): >>><<< 30583 1726853665.55424: stderr chunk (state=3): >>><<< 30583 1726853665.55479: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853665.55483: _low_level_execute_command(): starting 30583 1726853665.55485: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853664.7564113-30602-205863084595615/AnsiballZ_setup.py && sleep 0' 30583 1726853665.56135: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853665.56149: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853665.56167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853665.56186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853665.56239: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853665.56305: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 
1726853665.56335: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853665.56355: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853665.56483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853665.58806: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 30583 1726853665.58830: stdout chunk (state=3): >>>import _imp # builtin <<< 30583 1726853665.58866: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 30583 1726853665.58930: stdout chunk (state=3): >>>import '_io' # <<< 30583 1726853665.58944: stdout chunk (state=3): >>>import 'marshal' # <<< 30583 1726853665.58974: stdout chunk (state=3): >>>import 'posix' # <<< 30583 1726853665.59015: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 30583 1726853665.59036: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 30583 1726853665.59095: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 30583 1726853665.59123: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 30583 1726853665.59132: stdout chunk (state=3): >>>import 'codecs' # <<< 30583 1726853665.59160: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 30583 1726853665.59208: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a4e84d0> <<< 30583 1726853665.59235: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fa93a4b7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a4eaa50> <<< 30583 1726853665.59258: stdout chunk (state=3): >>>import '_signal' # <<< 30583 1726853665.59293: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 30583 1726853665.59314: stdout chunk (state=3): >>>import 'io' # <<< 30583 1726853665.59342: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 30583 1726853665.59451: stdout chunk (state=3): >>>import '_collections_abc' # <<< 30583 1726853665.59463: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 30583 1726853665.59512: stdout chunk (state=3): >>>import 'os' # <<< 30583 1726853665.59541: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<< 30583 1726853665.59576: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 30583 1726853665.59596: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 30583 1726853665.59608: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a299130> <<< 30583 1726853665.59682: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a299fa0> <<< 30583 1726853665.59708: stdout chunk (state=3): >>>import 'site' # <<< 30583 1726853665.59734: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 30583 1726853665.60138: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 30583 1726853665.60148: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 30583 1726853665.60178: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 30583 1726853665.60189: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 30583 1726853665.60231: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 30583 1726853665.60263: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 30583 1726853665.60266: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 30583 1726853665.60293: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a2d7dd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 30583 1726853665.60320: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 30583 1726853665.60373: stdout chunk 
(state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a2d7fe0> <<< 30583 1726853665.60376: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 30583 1726853665.60409: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 30583 1726853665.60411: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 30583 1726853665.60483: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # <<< 30583 1726853665.60526: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a30f800> <<< 30583 1726853665.60541: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 30583 1726853665.60578: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a30fe90> import '_collections' # <<< 30583 1726853665.60621: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a2efaa0> <<< 30583 1726853665.60632: stdout chunk (state=3): >>>import '_functools' # <<< 30583 1726853665.60658: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a2ed1c0> <<< 30583 1726853665.60760: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fa93a2d4f80> <<< 30583 1726853665.60779: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 30583 1726853665.60831: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 30583 1726853665.60835: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 30583 1726853665.60874: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 30583 1726853665.60909: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 30583 1726853665.60913: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a32f6e0> <<< 30583 1726853665.60959: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a32e300> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a2ee060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a2d6e70> <<< 30583 1726853665.61022: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 30583 1726853665.61066: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a3647a0> import 're' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa93a2d4200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 30583 1726853665.61093: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93a364c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a364b00> <<< 30583 1726853665.61137: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853665.61155: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93a364ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a2d2d20> <<< 30583 1726853665.61223: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 30583 1726853665.61227: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 30583 1726853665.61276: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 30583 1726853665.61327: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a3655b0> import 'importlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa93a365280> import 'importlib.machinery' # <<< 30583 1726853665.61331: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a3664b0> <<< 30583 1726853665.61350: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 30583 1726853665.61418: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 30583 1726853665.61438: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a37c680> import 'errno' # <<< 30583 1726853665.61467: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853665.61494: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93a37dd30> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 30583 1726853665.61537: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 30583 1726853665.61601: 
stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a37ebd0> <<< 30583 1726853665.61606: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93a37f230> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a37e120> <<< 30583 1726853665.61622: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 30583 1726853665.61668: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853665.61691: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93a37fcb0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a37f3e0> <<< 30583 1726853665.61717: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a366450> <<< 30583 1726853665.61743: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 30583 1726853665.61773: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 30583 1726853665.61807: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 30583 1726853665.61853: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 30583 1726853665.61858: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93a08bbc0> <<< 30583 1726853665.61929: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93a0b4710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a0b4470> <<< 30583 1726853665.61940: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93a0b46b0> <<< 30583 1726853665.61961: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 30583 1726853665.62034: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853665.62166: stdout chunk (state=3): >>># extension module '_hashlib' executed from 
'/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93a0b4fe0> <<< 30583 1726853665.62310: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93a0b5910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a0b4890> <<< 30583 1726853665.62361: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a089d60> <<< 30583 1726853665.62366: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 30583 1726853665.62420: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 30583 1726853665.62423: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a0b6cc0> <<< 30583 1726853665.62448: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a0b5790> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a366ba0> <<< 30583 1726853665.62485: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 30583 1726853665.62555: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 30583 
1726853665.62561: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 30583 1726853665.62616: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 30583 1726853665.62619: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a0e3020> <<< 30583 1726853665.62697: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 30583 1726853665.62700: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 30583 1726853665.62730: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 30583 1726853665.62781: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a1033e0> <<< 30583 1726853665.62791: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 30583 1726853665.62844: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 30583 1726853665.62907: stdout chunk (state=3): >>>import 'ntpath' # <<< 30583 1726853665.62924: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a164200> <<< 30583 1726853665.62960: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches 
/usr/lib64/python3.12/urllib/parse.py <<< 30583 1726853665.62994: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 30583 1726853665.62997: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 30583 1726853665.63028: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 30583 1726853665.63119: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a166960> <<< 30583 1726853665.63201: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a164320> <<< 30583 1726853665.63230: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a1311f0> <<< 30583 1726853665.63285: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa939f6d2e0> <<< 30583 1726853665.63298: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a1021e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a0b7bf0> <<< 30583 1726853665.63467: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 30583 1726853665.63501: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa93a102300> <<< 30583 1726853665.63769: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_hzcj93_r/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib 
available <<< 30583 1726853665.63893: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.63937: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 30583 1726853665.63941: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 30583 1726853665.63981: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 30583 1726853665.64073: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 30583 1726853665.64099: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa939fd2f90> import '_typing' # <<< 30583 1726853665.64310: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa939fb1e80> <<< 30583 1726853665.64313: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa939fb1040> # zipimport: zlib available <<< 30583 1726853665.64383: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available <<< 30583 1726853665.64386: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 30583 1726853665.64397: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.65788: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.66981: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from 
'/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa939fd0e60> <<< 30583 1726853665.66985: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 30583 1726853665.67009: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 30583 1726853665.67045: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 30583 1726853665.67058: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 30583 1726853665.67084: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93a006930> <<< 30583 1726853665.67154: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a006720> <<< 30583 1726853665.67182: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a006030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 30583 1726853665.67326: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a006a50> import 'json' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa939fd3c20> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93a0076b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93a0078f0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 30583 1726853665.67638: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a007e30> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa939929c70> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853665.67641: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93992b3b0> <<< 30583 1726853665.67644: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa93992c1d0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 30583 1726853665.67657: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 30583 1726853665.67680: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93992d370> <<< 30583 1726853665.67764: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 30583 1726853665.67817: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93992fe00> <<< 30583 1726853665.67863: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa939fb3080> <<< 30583 1726853665.67889: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93992e0c0> <<< 30583 1726853665.67980: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from 
'/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 30583 1726853665.67983: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 30583 1726853665.68098: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 30583 1726853665.68115: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 30583 1726853665.68194: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa939937d10> import '_tokenize' # <<< 30583 1726853665.68425: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9399367e0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa939936540> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa939936ab0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93992e5d0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93997bfb0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fa93997c0b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 30583 1726853665.68534: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93997db80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93997d940> <<< 30583 1726853665.68538: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 30583 1726853665.68553: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 30583 1726853665.68630: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa939980140> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93997e240> <<< 30583 1726853665.68653: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 30583 1726853665.68702: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 30583 1726853665.68786: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 30583 1726853665.68790: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa939983800> <<< 30583 1726853665.68887: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9399802f0> <<< 30583 1726853665.68974: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9399846e0> <<< 30583 1726853665.68986: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9399849b0> <<< 30583 1726853665.69025: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa939984a40> <<< 30583 1726853665.69057: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93997c2c0> <<< 30583 1726853665.69081: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 30583 1726853665.69105: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 30583 1726853665.69130: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853665.69164: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa939810260> <<< 30583 1726853665.69318: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853665.69322: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa939811280> <<< 30583 1726853665.69340: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9399869f0> <<< 30583 1726853665.69418: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa939987da0> <<< 30583 1726853665.69421: stdout chunk (state=3): 
>>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa939986630> # zipimport: zlib available # zipimport: zlib available <<< 30583 1726853665.69424: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 30583 1726853665.69515: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.69633: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.69636: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 30583 1726853665.69674: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 30583 1726853665.69793: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.69906: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.70463: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.71118: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 30583 1726853665.71132: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 30583 1726853665.71162: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa939815520> <<< 30583 1726853665.71213: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 30583 1726853665.71235: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9398163f0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9398113a0> <<< 30583 1726853665.71388: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 30583 1726853665.71401: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 30583 1726853665.71504: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.71658: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 30583 1726853665.71703: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa939816ab0> # zipimport: zlib available <<< 30583 1726853665.72150: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.72611: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.72693: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.72759: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 30583 1726853665.72774: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.72807: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.72980: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 30583 1726853665.73003: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.73037: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 30583 1726853665.73049: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.73091: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.73127: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 30583 1726853665.73138: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.73367: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.73601: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 30583 1726853665.73677: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 30583 1726853665.73698: stdout chunk (state=3): >>>import '_ast' # <<< 30583 1726853665.73928: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9398176b0> # zipimport: zlib available <<< 30583 1726853665.73936: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.73940: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 30583 1726853665.73961: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.73997: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.74034: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 30583 1726853665.74110: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30583 1726853665.74130: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.74261: stdout chunk (state=3): >>># zipimport: zlib available # 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 30583 1726853665.74313: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 30583 1726853665.74409: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa939822000> <<< 30583 1726853665.74441: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93981d940> <<< 30583 1726853665.74560: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available <<< 30583 1726853665.74617: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.74653: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.74697: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 30583 1726853665.74719: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 30583 1726853665.74750: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 30583 1726853665.74827: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 30583 1726853665.75076: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 30583 1726853665.75120: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 30583 1726853665.75123: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93990a990> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9399fe690> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9398220f0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9398170b0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 30583 1726853665.75173: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 30583 1726853665.75209: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 30583 1726853665.75228: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.75275: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.75342: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.75363: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.75384: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.75494: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 30583 1726853665.75542: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 30583 1726853665.75625: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.75695: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.75720: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.75765: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 30583 1726853665.75945: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.76118: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.76158: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.76216: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 30583 1726853665.76315: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 30583 1726853665.76345: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9398b6030> <<< 30583 1726853665.76366: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 30583 1726853665.76402: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 30583 1726853665.76512: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 30583 1726853665.76580: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9394b7fe0> <<< 30583 1726853665.76584: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9394bc350> <<< 30583 1726853665.76705: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93989f2f0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9398b6ba0> <<< 30583 1726853665.76714: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9398b4740> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9398b42c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 30583 1726853665.76748: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 30583 1726853665.76790: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 30583 1726853665.76868: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9394bf3b0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9394bec60> <<< 30583 1726853665.76878: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9394bee40> <<< 30583 1726853665.76964: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9394be090> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 30583 1726853665.77073: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 30583 1726853665.77077: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9394bf590> <<< 30583 1726853665.77079: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 30583 1726853665.77287: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 30583 1726853665.77319: stdout chunk (state=3): >>># 
extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa939522030> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9394bfcb0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9398b57f0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available <<< 30583 1726853665.77376: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 30583 1726853665.77392: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.77440: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.77488: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 30583 1726853665.77530: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 30583 1726853665.77546: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.77563: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.77629: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 30583 1726853665.77706: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.77709: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 30583 1726853665.77757: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.77802: 
stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 30583 1726853665.77812: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.77866: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.78051: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.78058: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.78087: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 30583 1726853665.78534: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.78984: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 30583 1726853665.78999: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.79057: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.79102: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.79182: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 30583 1726853665.79196: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.79222: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.79274: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 30583 1726853665.79301: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.79416: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.79419: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available <<< 30583 1726853665.79450: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 30583 1726853665.79466: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 30583 1726853665.79520: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # <<< 30583 1726853665.79614: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.79617: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.79712: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 30583 1726853665.79744: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9395222d0> <<< 30583 1726853665.79761: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 30583 1726853665.79797: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 30583 1726853665.79941: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa939522db0> import 'ansible.module_utils.facts.system.local' # <<< 30583 1726853665.79953: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.80075: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.80079: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 30583 1726853665.80144: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.80168: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.80266: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 30583 1726853665.80505: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 30583 1726853665.80563: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 30583 1726853665.80638: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853665.80705: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93955a300> <<< 30583 1726853665.80895: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93954a030> import 'ansible.module_utils.facts.system.python' # <<< 30583 1726853665.80911: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.80968: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.81088: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 30583 1726853665.81091: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.81114: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.81258: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.81373: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.81476: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 30583 1726853665.81515: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30583 1726853665.81572: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 30583 1726853665.81585: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.81608: stdout chunk (state=3): >>># zipimport: zlib available <<< 
30583 1726853665.81663: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 30583 1726853665.81693: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 30583 1726853665.81727: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93956dc70> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93956f5f0> import 'ansible.module_utils.facts.system.user' # <<< 30583 1726853665.81820: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available <<< 30583 1726853665.81860: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 30583 1726853665.81875: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.82030: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.82186: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 30583 1726853665.82199: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.82289: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.82389: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.82427: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.82585: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 30583 1726853665.82588: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.82590: 
stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30583 1726853665.82684: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.82829: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 30583 1726853665.82847: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.82965: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.83090: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 30583 1726853665.83131: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.83228: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.83754: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.84286: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 30583 1726853665.84304: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.84477: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.84528: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 30583 1726853665.84624: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.84720: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 30583 1726853665.84753: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.84975: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.85062: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available <<< 30583 1726853665.85080: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 30583 1726853665.85167: 
stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30583 1726853665.85172: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 30583 1726853665.85195: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.85282: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.85383: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.85591: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.85799: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 30583 1726853665.86058: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available <<< 30583 1726853665.86062: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 30583 1726853665.86064: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.86098: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # <<< 30583 1726853665.86125: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.86131: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.86295: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available <<< 30583 1726853665.86299: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 30583 1726853665.86301: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.86364: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.86423: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 30583 1726853665.86429: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 30583 1726853665.86705: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.86965: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 30583 1726853665.86973: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.87068: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.87194: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available <<< 30583 1726853665.87277: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 30583 1726853665.87304: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 30583 1726853665.87334: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 30583 1726853665.87340: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.87433: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.87522: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 30583 1726853665.87592: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 30583 1726853665.87631: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.87652: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 30583 1726853665.87668: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.87742: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30583 1726853665.87749: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.87804: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 
1726853665.87931: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.88178: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 30583 1726853665.88200: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 30583 1726853665.88286: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.88482: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 30583 1726853665.88537: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.88603: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 30583 1726853665.88641: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.88755: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 30583 1726853665.88853: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.88992: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 30583 1726853665.88995: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.89058: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 30583 1726853665.89064: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 30583 1726853665.89147: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853665.89378: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # 
code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 30583 1726853665.89416: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 30583 1726853665.89422: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa939306600> <<< 30583 1726853665.89476: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa939305310> <<< 30583 1726853665.89485: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9392fff80> <<< 30583 1726853666.01521: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93934f560> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93934cfe0> <<< 30583 1726853666.01526: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 30583 1726853666.01529: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 30583 1726853666.01601: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93934e8d0> <<< 30583 1726853666.01605: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93934e030> <<< 30583 1726853666.02035: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 30583 1726853666.26580: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-197.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-197", "ansible_nodename": "ip-10-31-9-197.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2134955d8b5184190900489dab957f", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": 
"enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 60520 10.31.9.197 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 60520 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2952, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, 
"ansible_memory_mb": {"real": {"total": 3531, "used": 579, "free": 2952}, "nocache": {"free": 3292, "used": 239}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec213495-5d8b-5184-1909-00489dab957f", "ansible_product_uuid": "ec213495-5d8b-5184-1909-00489dab957f", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 876, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": 
"rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789990912, "block_size": 4096, "block_total": 65519099, "block_available": 63913572, "block_used": 1605527, "inode_total": 131070960, "inode_available": 131029064, "inode_used": 41896, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "34", "second": "26", "epoch": "1726853666", "epoch_int": "1726853666", "date": "2024-09-20", "time": "13:34:26", "iso8601_micro": "2024-09-20T17:34:26.218087Z", "iso8601": "2024-09-20T17:34:26Z", "iso8601_basic": "20240920T133426218087", "iso8601_basic_short": "20240920T133426", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDiy4Yen7eiWP0/hmH4/5WHzI91c8NPRAJCku4Kk63/nAM2/HDHVpCGbs8kPnAcpJ95BGnY2AZ50i/GjByh6rqN4q0QNajZqOQdMrkomTRQGFsaoQTUzu+Wt7NYtajPseEV2zJTYbIlIC8H5nwTib7SkZscdc1iTw0saFFpV/aB+l5BDLfOe5EeE772aMDPUwKIw9RVy45e9Dl7uEv/Ez5XL/ZsZ8K0iZ4v2/Ebj39j+tw5M9hEjzRp4dqgv4FTXaFf2TvCql8dulUOPsjMu2MIvIfB4FbPNXrGKPKbzkjxWn4r+wUuvMPr4zoIJieVXFTR6ozZdzis6d3WFGAgZgX3ns+ULgR+lp0ZvHZb2amOGE8aM1TdwnDCeanweLvXk4zxXrpg0T4bTmQwKkDtd0DFml2CkWe4615TK07c49NoApmnEgPdztwxtraghMO72UOZkRBgUDB5GKSc202pCChA/GqiwfaUPdjS4LyUdkhgYAUniLPI2FRsZg4+EpoMZgs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAviMKS0iYCdMhDNjaRFlzVurOd6RVFe0VKYVOOZJko3KaULgIYAaS/l/1rRBz1963986hrDhKrLwmMRxr85S4Q=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAGtlq4ktcSkdXJkETJjSEIO/6xbcTDcVVefyj1D7mpG", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_loadavg": {"1m": 0.736328125, "5m": 0.63525390625, "15m": 0.36962890625}, "ansible_local": {}, "ansible_fips": false, "ansible_lsb": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:bc:da:29:a4:45", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.197", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10bc:daff:fe29:a445", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", 
"tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], 
"hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off 
[fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.197", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:bc:da:29:a4:45", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.197"], "ansible_all_ipv6_addresses": ["fe80::10bc:daff:fe29:a445"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.197", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10bc:daff:fe29:a445"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 30583 1726853666.27141: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread<<< 30583 1726853666.27146: stdout chunk 
(state=3): >>> # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools <<< 30583 1726853666.27153: stdout chunk (state=3): >>># cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # 
cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib <<< 30583 1726853666.27159: stdout chunk (state=3): >>># cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437<<< 30583 1726853666.27248: stdout chunk (state=3): >>> # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # 
cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing 
ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters <<< 30583 1726853666.27256: stdout chunk (state=3): >>># destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing 
ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue <<< 30583 1726853666.27343: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] 
removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # 
cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat <<< 30583 1726853666.27386: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy 
ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # 
destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi <<< 30583 1726853666.27409: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 30583 1726853666.27933: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib <<< 30583 1726853666.28004: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy 
_locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 30583 1726853666.28008: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 30583 1726853666.28115: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 30583 1726853666.28119: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 30583 1726853666.28179: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle <<< 30583 1726853666.28213: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 30583 1726853666.28273: stdout chunk (state=3): >>># destroy _ssl <<< 30583 1726853666.28319: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json <<< 30583 1726853666.28374: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing <<< 30583 1726853666.28380: stdout chunk (state=3): >>># destroy array # destroy multiprocessing.dummy.connection <<< 30583 1726853666.28663: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # 
cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc <<< 30583 1726853666.28667: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping 
encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 30583 1726853666.28669: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 30583 1726853666.28854: stdout chunk (state=3): >>># destroy sys.monitoring <<< 30583 1726853666.28888: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 30583 1726853666.28914: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 30583 1726853666.29102: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 30583 1726853666.29140: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy 
threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 30583 1726853666.29165: stdout chunk (state=3): >>># destroy time <<< 30583 1726853666.29227: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re <<< 30583 1726853666.29278: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 30583 1726853666.29296: stdout chunk (state=3): >>># clear sys.audit hooks <<< 30583 1726853666.29908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853666.29940: stdout chunk (state=3): >>><<< 30583 1726853666.29943: stderr chunk (state=3): >>><<< 30583 1726853666.30213: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a4e84d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a4b7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a4eaa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a299130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a299fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a2d7dd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a2d7fe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a30f800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa93a30fe90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a2efaa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a2ed1c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a2d4f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a32f6e0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a32e300> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a2ee060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a2d6e70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a3647a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a2d4200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93a364c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a364b00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93a364ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a2d2d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a3655b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a365280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a3664b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a37c680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93a37dd30> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a37ebd0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93a37f230> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a37e120> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93a37fcb0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a37f3e0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a366450> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93a08bbc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93a0b4710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a0b4470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93a0b46b0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93a0b4fe0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93a0b5910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a0b4890> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a089d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a0b6cc0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a0b5790> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a366ba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa93a0e3020> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a1033e0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a164200> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a166960> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a164320> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a1311f0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa939f6d2e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a1021e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a0b7bf0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa93a102300> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_hzcj93_r/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa939fd2f90> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa939fb1e80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa939fb1040> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa939fd0e60> # 
/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93a006930> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a006720> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a006030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a006a50> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa939fd3c20> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93a0076b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93a0078f0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93a007e30> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa939929c70> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93992b3b0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93992c1d0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93992d370> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa93992fe00> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa939fb3080> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93992e0c0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa939937d10> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9399367e0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa939936540> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa939936ab0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93992e5d0> # extension module 'syslog' loaded from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93997bfb0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93997c0b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93997db80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93997d940> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa939980140> import 'uuid' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa93997e240> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa939983800> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9399802f0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9399846e0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9399849b0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa939984a40> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93997c2c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa939810260> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa939811280> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9399869f0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa939987da0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa939986630> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa939815520> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9398163f0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9398113a0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa939816ab0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9398176b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa939822000> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93981d940> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93990a990> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9399fe690> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9398220f0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9398170b0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9398b6030> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9394b7fe0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9394bc350> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93989f2f0> import 'multiprocessing.reduction' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa9398b6ba0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9398b4740> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9398b42c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9394bf3b0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9394bec60> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9394bee40> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9394be090> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9394bf590> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa939522030> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9394bfcb0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9398b57f0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9395222d0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa939522db0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93955a300> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93954a030> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa93956dc70> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93956f5f0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa939306600> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa939305310> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9392fff80> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93934f560> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93934cfe0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93934e8d0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa93934e030> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-197.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-197", "ansible_nodename": "ip-10-31-9-197.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2134955d8b5184190900489dab957f", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": 
"/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 60520 10.31.9.197 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 60520 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2952, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 579, "free": 2952}, "nocache": {"free": 3292, "used": 239}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", 
"ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec213495-5d8b-5184-1909-00489dab957f", "ansible_product_uuid": "ec213495-5d8b-5184-1909-00489dab957f", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 876, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789990912, "block_size": 4096, "block_total": 65519099, "block_available": 63913572, "block_used": 1605527, "inode_total": 131070960, "inode_available": 131029064, "inode_used": 41896, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": 
["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "34", "second": "26", "epoch": "1726853666", "epoch_int": "1726853666", "date": "2024-09-20", "time": "13:34:26", "iso8601_micro": "2024-09-20T17:34:26.218087Z", "iso8601": "2024-09-20T17:34:26Z", "iso8601_basic": "20240920T133426218087", "iso8601_basic_short": "20240920T133426", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDiy4Yen7eiWP0/hmH4/5WHzI91c8NPRAJCku4Kk63/nAM2/HDHVpCGbs8kPnAcpJ95BGnY2AZ50i/GjByh6rqN4q0QNajZqOQdMrkomTRQGFsaoQTUzu+Wt7NYtajPseEV2zJTYbIlIC8H5nwTib7SkZscdc1iTw0saFFpV/aB+l5BDLfOe5EeE772aMDPUwKIw9RVy45e9Dl7uEv/Ez5XL/ZsZ8K0iZ4v2/Ebj39j+tw5M9hEjzRp4dqgv4FTXaFf2TvCql8dulUOPsjMu2MIvIfB4FbPNXrGKPKbzkjxWn4r+wUuvMPr4zoIJieVXFTR6ozZdzis6d3WFGAgZgX3ns+ULgR+lp0ZvHZb2amOGE8aM1TdwnDCeanweLvXk4zxXrpg0T4bTmQwKkDtd0DFml2CkWe4615TK07c49NoApmnEgPdztwxtraghMO72UOZkRBgUDB5GKSc202pCChA/GqiwfaUPdjS4LyUdkhgYAUniLPI2FRsZg4+EpoMZgs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAviMKS0iYCdMhDNjaRFlzVurOd6RVFe0VKYVOOZJko3KaULgIYAaS/l/1rRBz1963986hrDhKrLwmMRxr85S4Q=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAGtlq4ktcSkdXJkETJjSEIO/6xbcTDcVVefyj1D7mpG", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_loadavg": {"1m": 0.736328125, "5m": 0.63525390625, "15m": 0.36962890625}, "ansible_local": {}, "ansible_fips": false, "ansible_lsb": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:bc:da:29:a4:45", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.197", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10bc:daff:fe29:a445", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": 
"off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off 
[fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", 
"hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.197", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:bc:da:29:a4:45", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.197"], "ansible_all_ipv6_addresses": ["fe80::10bc:daff:fe29:a445"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.197", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10bc:daff:fe29:a445"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # 
cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # 
cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing 
_socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing 
ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # 
cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing 
ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # 
destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] 
removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata 
# destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # 
cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy 
_bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
[WARNING]: Module invocation had junk after the JSON data:
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy 
grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping 
_datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy 
_datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
30583 1726853666.31716: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853664.7564113-30602-205863084595615/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853666.31719: _low_level_execute_command(): starting 30583 1726853666.31722: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853664.7564113-30602-205863084595615/ > /dev/null 2>&1 && sleep 0' 30583 1726853666.32678: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853666.32694: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853666.32983: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853666.33084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 3 <<< 30583 1726853666.35866: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853666.35870: stdout chunk (state=3): >>><<< 30583 1726853666.35880: stderr chunk (state=3): >>><<< 30583 1726853666.35899: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 3 debug2: Received exit status from master 0 30583 1726853666.35907: handler run complete 30583 1726853666.36031: variable 'ansible_facts' from source: unknown 30583 1726853666.36141: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853666.36488: variable 'ansible_facts' from source: unknown 30583 1726853666.36581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853666.36714: attempt loop complete, returning result 30583 1726853666.36717: _execute() done 30583 1726853666.36720: dumping result to json 30583 1726853666.36763: done dumping result, returning 30583 1726853666.36774: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [02083763-bbaf-05ea-abc5-00000000001b] 30583 1726853666.36779: sending task result for task 02083763-bbaf-05ea-abc5-00000000001b 30583 1726853666.37322: done sending task result for task 02083763-bbaf-05ea-abc5-00000000001b 30583 1726853666.37325: WORKER PROCESS EXITING ok: [managed_node2] 30583 1726853666.37637: no more pending results, returning what we have 30583 1726853666.37640: results queue empty 30583 1726853666.37640: checking for any_errors_fatal 30583 1726853666.37642: done checking for any_errors_fatal 30583 1726853666.37642: checking for max_fail_percentage 30583 1726853666.37644: done checking for max_fail_percentage 30583 1726853666.37644: checking to see if all hosts have failed and the running result is not ok 30583 1726853666.37645: done checking to see if all hosts have failed 30583 1726853666.37646: getting the remaining hosts for this loop 30583 1726853666.37647: done getting the remaining hosts for this loop 30583 1726853666.37650: getting the next task for host managed_node2 30583 1726853666.37656: done getting next task for host managed_node2 30583 1726853666.37658: ^ task is: TASK: meta (flush_handlers) 30583 1726853666.37659: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30583 1726853666.37663: getting variables 30583 1726853666.37664: in VariableManager get_vars() 30583 1726853666.37685: Calling all_inventory to load vars for managed_node2 30583 1726853666.37688: Calling groups_inventory to load vars for managed_node2 30583 1726853666.37691: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853666.37700: Calling all_plugins_play to load vars for managed_node2 30583 1726853666.37709: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853666.37712: Calling groups_plugins_play to load vars for managed_node2 30583 1726853666.37909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853666.38176: done with get_vars() 30583 1726853666.38187: done getting variables 30583 1726853666.38256: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ 30583 1726853666.38321: in VariableManager get_vars() 30583 1726853666.38340: Calling all_inventory to load vars for managed_node2 30583 1726853666.38342: Calling groups_inventory to load vars for managed_node2 30583 1726853666.38345: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853666.38349: Calling all_plugins_play to load vars for managed_node2 30583 1726853666.38351: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853666.38354: Calling groups_plugins_play to load vars for managed_node2 30583 1726853666.38523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853666.38842: done with get_vars() 30583 1726853666.38856: done queuing things up, now waiting for results queue to drain 30583 1726853666.38858: results queue empty 30583 1726853666.38859: checking for any_errors_fatal 30583 1726853666.38861: done checking for any_errors_fatal 30583 1726853666.38862: checking 
for max_fail_percentage 30583 1726853666.38863: done checking for max_fail_percentage 30583 1726853666.38863: checking to see if all hosts have failed and the running result is not ok 30583 1726853666.38868: done checking to see if all hosts have failed 30583 1726853666.38869: getting the remaining hosts for this loop 30583 1726853666.38870: done getting the remaining hosts for this loop 30583 1726853666.38913: getting the next task for host managed_node2 30583 1726853666.38918: done getting next task for host managed_node2 30583 1726853666.38921: ^ task is: TASK: Include the task 'el_repo_setup.yml' 30583 1726853666.38923: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853666.38926: getting variables 30583 1726853666.38927: in VariableManager get_vars() 30583 1726853666.38936: Calling all_inventory to load vars for managed_node2 30583 1726853666.38938: Calling groups_inventory to load vars for managed_node2 30583 1726853666.38940: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853666.38945: Calling all_plugins_play to load vars for managed_node2 30583 1726853666.38947: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853666.38950: Calling groups_plugins_play to load vars for managed_node2 30583 1726853666.39109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853666.39327: done with get_vars() 30583 1726853666.39345: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:11 Friday 20 September 2024 13:34:26 -0400 (0:00:01.701) 
0:00:01.731 ****** 30583 1726853666.39438: entering _queue_task() for managed_node2/include_tasks 30583 1726853666.39440: Creating lock for include_tasks 30583 1726853666.40002: worker is 1 (out of 1 available) 30583 1726853666.40009: exiting _queue_task() for managed_node2/include_tasks 30583 1726853666.40019: done queuing things up, now waiting for results queue to drain 30583 1726853666.40021: waiting for pending results... 30583 1726853666.40099: running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' 30583 1726853666.40218: in run() - task 02083763-bbaf-05ea-abc5-000000000006 30583 1726853666.40265: variable 'ansible_search_path' from source: unknown 30583 1726853666.40321: calling self._execute() 30583 1726853666.40590: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853666.40594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853666.40597: variable 'omit' from source: magic vars 30583 1726853666.40599: _execute() done 30583 1726853666.40601: dumping result to json 30583 1726853666.40604: done dumping result, returning 30583 1726853666.40606: done running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' [02083763-bbaf-05ea-abc5-000000000006] 30583 1726853666.40608: sending task result for task 02083763-bbaf-05ea-abc5-000000000006 30583 1726853666.40690: done sending task result for task 02083763-bbaf-05ea-abc5-000000000006 30583 1726853666.40694: WORKER PROCESS EXITING 30583 1726853666.40735: no more pending results, returning what we have 30583 1726853666.40741: in VariableManager get_vars() 30583 1726853666.40775: Calling all_inventory to load vars for managed_node2 30583 1726853666.40778: Calling groups_inventory to load vars for managed_node2 30583 1726853666.40782: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853666.40802: Calling all_plugins_play to load vars for managed_node2 30583 1726853666.40806: 
Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853666.40810: Calling groups_plugins_play to load vars for managed_node2 30583 1726853666.41081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853666.41459: done with get_vars() 30583 1726853666.41466: variable 'ansible_search_path' from source: unknown 30583 1726853666.41482: we have included files to process 30583 1726853666.41483: generating all_blocks data 30583 1726853666.41484: done generating all_blocks data 30583 1726853666.41485: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 30583 1726853666.41486: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 30583 1726853666.41489: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 30583 1726853666.42525: in VariableManager get_vars() 30583 1726853666.42542: done with get_vars() 30583 1726853666.42554: done processing included file 30583 1726853666.42557: iterating over new_blocks loaded from include file 30583 1726853666.42559: in VariableManager get_vars() 30583 1726853666.42567: done with get_vars() 30583 1726853666.42568: filtering new block on tags 30583 1726853666.42588: done filtering new block on tags 30583 1726853666.42591: in VariableManager get_vars() 30583 1726853666.42601: done with get_vars() 30583 1726853666.42602: filtering new block on tags 30583 1726853666.42616: done filtering new block on tags 30583 1726853666.42618: in VariableManager get_vars() 30583 1726853666.42627: done with get_vars() 30583 1726853666.42629: filtering new block on tags 30583 1726853666.42641: done filtering new block on tags 30583 1726853666.42643: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node2 30583 1726853666.42649: extending task lists for all hosts with included blocks 30583 1726853666.42705: done extending task lists 30583 1726853666.42706: done processing included files 30583 1726853666.42707: results queue empty 30583 1726853666.42708: checking for any_errors_fatal 30583 1726853666.42709: done checking for any_errors_fatal 30583 1726853666.42710: checking for max_fail_percentage 30583 1726853666.42711: done checking for max_fail_percentage 30583 1726853666.42712: checking to see if all hosts have failed and the running result is not ok 30583 1726853666.42712: done checking to see if all hosts have failed 30583 1726853666.42713: getting the remaining hosts for this loop 30583 1726853666.42714: done getting the remaining hosts for this loop 30583 1726853666.42716: getting the next task for host managed_node2 30583 1726853666.42720: done getting next task for host managed_node2 30583 1726853666.42722: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 30583 1726853666.42724: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853666.42726: getting variables 30583 1726853666.42727: in VariableManager get_vars() 30583 1726853666.42741: Calling all_inventory to load vars for managed_node2 30583 1726853666.42743: Calling groups_inventory to load vars for managed_node2 30583 1726853666.42745: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853666.42750: Calling all_plugins_play to load vars for managed_node2 30583 1726853666.42753: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853666.42757: Calling groups_plugins_play to load vars for managed_node2 30583 1726853666.42910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853666.43099: done with get_vars() 30583 1726853666.43111: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 13:34:26 -0400 (0:00:00.037) 0:00:01.769 ****** 30583 1726853666.43190: entering _queue_task() for managed_node2/setup 30583 1726853666.43594: worker is 1 (out of 1 available) 30583 1726853666.43605: exiting _queue_task() for managed_node2/setup 30583 1726853666.43617: done queuing things up, now waiting for results queue to drain 30583 1726853666.43618: waiting for pending results... 
30583 1726853666.43880: running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 30583 1726853666.43980: in run() - task 02083763-bbaf-05ea-abc5-00000000002c 30583 1726853666.43996: variable 'ansible_search_path' from source: unknown 30583 1726853666.44003: variable 'ansible_search_path' from source: unknown 30583 1726853666.44043: calling self._execute() 30583 1726853666.44125: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853666.44276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853666.44279: variable 'omit' from source: magic vars 30583 1726853666.44647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853666.46813: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853666.46881: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853666.46919: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853666.47190: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853666.47223: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853666.47338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853666.47366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853666.47399: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853666.47450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853666.47541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853666.47660: variable 'ansible_facts' from source: unknown 30583 1726853666.47749: variable 'network_test_required_facts' from source: task vars 30583 1726853666.47798: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 30583 1726853666.47810: variable 'omit' from source: magic vars 30583 1726853666.47850: variable 'omit' from source: magic vars 30583 1726853666.47896: variable 'omit' from source: magic vars 30583 1726853666.47925: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853666.47957: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853666.48003: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853666.48025: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853666.48039: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853666.48274: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853666.48278: variable 'ansible_host' from source: host vars for 
'managed_node2' 30583 1726853666.48280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853666.48282: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853666.48284: Set connection var ansible_timeout to 10 30583 1726853666.48286: Set connection var ansible_connection to ssh 30583 1726853666.48288: Set connection var ansible_shell_executable to /bin/sh 30583 1726853666.48290: Set connection var ansible_shell_type to sh 30583 1726853666.48292: Set connection var ansible_pipelining to False 30583 1726853666.48294: variable 'ansible_shell_executable' from source: unknown 30583 1726853666.48296: variable 'ansible_connection' from source: unknown 30583 1726853666.48297: variable 'ansible_module_compression' from source: unknown 30583 1726853666.48299: variable 'ansible_shell_type' from source: unknown 30583 1726853666.48301: variable 'ansible_shell_executable' from source: unknown 30583 1726853666.48302: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853666.48304: variable 'ansible_pipelining' from source: unknown 30583 1726853666.48306: variable 'ansible_timeout' from source: unknown 30583 1726853666.48308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853666.48476: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853666.48493: variable 'omit' from source: magic vars 30583 1726853666.48503: starting attempt loop 30583 1726853666.48510: running the handler 30583 1726853666.48531: _low_level_execute_command(): starting 30583 1726853666.48544: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853666.49390: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853666.49495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853666.49499: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853666.49513: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853666.49534: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853666.49648: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30583 1726853666.52154: stdout chunk (state=3): >>>/root <<< 30583 1726853666.52253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853666.52259: stdout chunk (state=3): >>><<< 30583 1726853666.52400: stderr chunk (state=3): >>><<< 30583 1726853666.52404: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 
10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30583 1726853666.52415: _low_level_execute_command(): starting 30583 1726853666.52419: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853666.5229979-30676-248925546046076 `" && echo ansible-tmp-1726853666.5229979-30676-248925546046076="` echo /root/.ansible/tmp/ansible-tmp-1726853666.5229979-30676-248925546046076 `" ) && sleep 0' 30583 1726853666.53046: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853666.53083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853666.53099: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853666.53111: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853666.53220: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30583 1726853666.56198: stdout chunk (state=3): >>>ansible-tmp-1726853666.5229979-30676-248925546046076=/root/.ansible/tmp/ansible-tmp-1726853666.5229979-30676-248925546046076 <<< 30583 1726853666.56402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853666.56406: stdout chunk (state=3): >>><<< 30583 1726853666.56409: stderr chunk (state=3): >>><<< 30583 1726853666.56427: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853666.5229979-30676-248925546046076=/root/.ansible/tmp/ansible-tmp-1726853666.5229979-30676-248925546046076 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30583 1726853666.56516: variable 'ansible_module_compression' from source: unknown 30583 1726853666.56560: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 30583 1726853666.56639: variable 'ansible_facts' from source: unknown 30583 1726853666.56880: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853666.5229979-30676-248925546046076/AnsiballZ_setup.py 30583 1726853666.57144: Sending initial data 30583 1726853666.57147: Sent initial data (154 bytes) 30583 1726853666.57785: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853666.57848: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853666.57867: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853666.57885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853666.57998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30583 1726853666.60432: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853666.60516: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853666.60611: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp1qqzi9r9 /root/.ansible/tmp/ansible-tmp-1726853666.5229979-30676-248925546046076/AnsiballZ_setup.py <<< 30583 1726853666.60615: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853666.5229979-30676-248925546046076/AnsiballZ_setup.py" <<< 30583 1726853666.60693: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp1qqzi9r9" to remote "/root/.ansible/tmp/ansible-tmp-1726853666.5229979-30676-248925546046076/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853666.5229979-30676-248925546046076/AnsiballZ_setup.py" <<< 30583 1726853666.62528: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853666.62542: stderr chunk (state=3): >>><<< 30583 1726853666.62545: stdout chunk (state=3): >>><<< 30583 1726853666.62572: done transferring module to remote 30583 1726853666.62586: _low_level_execute_command(): starting 30583 1726853666.62589: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853666.5229979-30676-248925546046076/ /root/.ansible/tmp/ansible-tmp-1726853666.5229979-30676-248925546046076/AnsiballZ_setup.py && sleep 0' 30583 1726853666.63277: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853666.63281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853666.63283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853666.63293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853666.63295: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 
1726853666.63377: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853666.63380: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853666.63388: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853666.63407: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853666.63515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30583 1726853666.66282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853666.66287: stdout chunk (state=3): >>><<< 30583 1726853666.66289: stderr chunk (state=3): >>><<< 30583 1726853666.66399: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30583 1726853666.66403: _low_level_execute_command(): starting 30583 1726853666.66405: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853666.5229979-30676-248925546046076/AnsiballZ_setup.py && sleep 0' 30583 1726853666.67086: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853666.67103: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853666.67119: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853666.67235: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30583 1726853666.70521: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 30583 1726853666.70574: stdout chunk (state=3): >>>import _imp # builtin <<< 30583 1726853666.70611: stdout chunk (state=3): >>>import '_thread' # <<< 30583 1726853666.70636: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 30583 1726853666.70732: stdout chunk (state=3): >>>import '_io' # <<< 30583 1726853666.70753: stdout chunk (state=3): >>>import 'marshal' # <<< 30583 1726853666.70805: stdout chunk (state=3): >>>import 'posix' # <<< 30583 1726853666.70859: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 30583 1726853666.70882: stdout chunk (state=3): >>># installing zipimport hook <<< 30583 1726853666.71084: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 30583 1726853666.71122: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d1bc4d0> <<< 30583 1726853666.71140: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d18bb00> <<< 30583 1726853666.71178: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 30583 1726853666.71204: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 30583 1726853666.71237: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d1bea50> <<< 30583 1726853666.71240: stdout chunk (state=3): >>>import '_signal' # <<< 30583 1726853666.71289: stdout chunk (state=3): >>>import '_abc' # <<< 30583 1726853666.71292: stdout chunk (state=3): >>>import 'abc' # <<< 30583 1726853666.71324: stdout chunk (state=3): >>>import 'io' # <<< 30583 1726853666.71376: stdout chunk (state=3): >>>import '_stat' # <<< 30583 1726853666.71386: stdout chunk (state=3): >>>import 'stat' # <<< 30583 1726853666.71506: stdout chunk (state=3): >>>import '_collections_abc' # <<< 30583 1726853666.71558: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 30583 1726853666.71607: stdout chunk (state=3): >>>import 'os' # <<< 30583 1726853666.71651: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 30583 1726853666.71654: stdout chunk (state=3): >>>Processing user site-packages <<< 30583 1726853666.71781: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d1cd130> <<< 30583 1726853666.71833: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 30583 1726853666.71859: stdout chunk (state=3): >>># code object from 
'/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 30583 1726853666.71880: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d1cdfa0> <<< 30583 1726853666.71902: stdout chunk (state=3): >>>import 'site' # <<< 30583 1726853666.71953: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 30583 1726853666.72595: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 30583 1726853666.72612: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 30583 1726853666.72649: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 30583 1726853666.72669: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 30583 1726853666.72883: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cfabe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 30583 1726853666.73085: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cfabf50> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # <<< 30583 1726853666.73126: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<< 30583 1726853666.73145: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cfe3890> <<< 30583 1726853666.73177: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 30583 1726853666.73204: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cfe3f20> <<< 30583 1726853666.73225: stdout chunk (state=3): >>>import '_collections' # <<< 30583 1726853666.73308: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cfc3b60> <<< 30583 1726853666.73331: stdout chunk (state=3): >>>import '_functools' # <<< 30583 1726853666.73372: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cfc1280> <<< 30583 1726853666.73519: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cfa9040> <<< 30583 1726853666.73551: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 30583 1726853666.73601: stdout chunk (state=3): 
>>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 30583 1726853666.73611: stdout chunk (state=3): >>>import '_sre' # <<< 30583 1726853666.73657: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 30583 1726853666.73702: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 30583 1726853666.73726: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 30583 1726853666.73754: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 30583 1726853666.73802: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d003800> <<< 30583 1726853666.73992: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d002420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cfc2150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d000b60> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d038860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cfa82c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 30583 
1726853666.74039: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853666.74043: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5d038d10> <<< 30583 1726853666.74064: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d038bc0> <<< 30583 1726853666.74115: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853666.74143: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853666.74162: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5d038f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cfa6de0> <<< 30583 1726853666.74199: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 30583 1726853666.74208: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 30583 1726853666.74241: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 30583 1726853666.74295: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 30583 1726853666.74321: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d039610> <<< 30583 1726853666.74330: stdout chunk (state=3): >>>import 'importlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d0392e0> <<< 30583 1726853666.74490: stdout chunk (state=3): >>>import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d03a510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 30583 1726853666.74504: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 30583 1726853666.74536: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 30583 1726853666.74548: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d050710> <<< 30583 1726853666.74576: stdout chunk (state=3): >>>import 'errno' # <<< 30583 1726853666.74607: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853666.74639: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5d051df0><<< 30583 1726853666.74647: stdout chunk (state=3): >>> <<< 30583 1726853666.74684: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 30583 1726853666.74704: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 30583 1726853666.74807: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d052c90> <<< 30583 1726853666.74842: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5d0532f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d0521e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 30583 1726853666.74845: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 30583 1726853666.75131: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5d053d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d0534a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d03a540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # 
extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5cd47c50> <<< 30583 1726853666.75139: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 30583 1726853666.75144: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5cd707a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cd70500> <<< 30583 1726853666.75201: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5cd70710> <<< 30583 1726853666.75204: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 30583 1726853666.75279: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853666.75416: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5cd710d0> <<< 30583 1726853666.75568: 
stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5cd71ac0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cd70980> <<< 30583 1726853666.75649: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cd45df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 30583 1726853666.75688: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cd72ea0> <<< 30583 1726853666.75746: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cd71be0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d03ac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 30583 1726853666.75827: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 30583 1726853666.75831: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 30583 1726853666.75883: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 30583 1726853666.75896: stdout chunk (state=3): >>>import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cd9b1d0> <<< 30583 1726853666.75972: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 30583 1726853666.75995: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 30583 1726853666.76011: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 30583 1726853666.76084: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cdbf590> <<< 30583 1726853666.76086: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 30583 1726853666.76110: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 30583 1726853666.76173: stdout chunk (state=3): >>>import 'ntpath' # <<< 30583 1726853666.76594: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5ce202f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5ce22a20> import 
'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5ce203e0> <<< 30583 1726853666.76619: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cde5310> <<< 30583 1726853666.76658: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c7213a0> <<< 30583 1726853666.76704: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cdbe390> <<< 30583 1726853666.76726: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cd73da0> <<< 30583 1726853666.77010: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f3d5cdbe990> <<< 30583 1726853666.77376: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_1hdo05h_/ansible_setup_payload.zip'<<< 30583 1726853666.77403: stdout chunk (state=3): >>> <<< 30583 1726853666.77470: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.77621: stdout chunk (state=3): >>># zipimport: zlib available<<< 30583 1726853666.77652: stdout chunk (state=3): >>> <<< 30583 1726853666.77673: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 30583 1726853666.77707: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 30583 1726853666.77788: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 30583 
1726853666.77916: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc'<<< 30583 1726853666.77981: stdout chunk (state=3): >>> # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 30583 1726853666.78011: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c78b080> import '_typing' # <<< 30583 1726853666.78078: stdout chunk (state=3): >>> <<< 30583 1726853666.78356: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c769f70> <<< 30583 1726853666.78387: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c769100> <<< 30583 1726853666.78400: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.78441: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available<<< 30583 1726853666.78483: stdout chunk (state=3): >>> # zipimport: zlib available <<< 30583 1726853666.78515: stdout chunk (state=3): >>># zipimport: zlib available<<< 30583 1726853666.78528: stdout chunk (state=3): >>> import 'ansible.module_utils' # <<< 30583 1726853666.78570: stdout chunk (state=3): >>> # zipimport: zlib available <<< 30583 1726853666.80265: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.81476: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c788f20> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 30583 1726853666.81525: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 30583 1726853666.81731: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 30583 1726853666.81781: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c7ba990> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c7ba780> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c7ba090> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c7baab0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c78bb00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c7bb740> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c7bb980> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 30583 1726853666.81860: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 30583 1726853666.81938: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c7bbe90> import 'pwd' # <<< 30583 1726853666.81982: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 30583 1726853666.82032: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c625c10> <<< 30583 1726853666.82054: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853666.82088: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c627800> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 30583 1726853666.82161: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c628200> <<< 30583 1726853666.82195: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 30583 1726853666.82242: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 30583 1726853666.82245: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c629100> <<< 30583 1726853666.82269: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 30583 1726853666.82330: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 30583 1726853666.82365: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 30583 1726853666.82422: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c62be30> <<< 30583 1726853666.82468: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5cfa6ed0> <<< 30583 1726853666.82499: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c62a0f0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 30583 1726853666.82581: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 30583 
1726853666.82752: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 30583 1726853666.82789: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c633d40> <<< 30583 1726853666.82836: stdout chunk (state=3): >>>import '_tokenize' # <<< 30583 1726853666.82932: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c632810> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c632570> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 30583 1726853666.83084: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c632ae0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c62a600> <<< 30583 1726853666.83127: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c677a10> <<< 30583 1726853666.83167: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c6781a0> <<< 30583 1726853666.83203: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 30583 1726853666.83275: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853666.83288: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c679c10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c6799d0> <<< 30583 1726853666.83341: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 30583 1726853666.83509: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853666.83513: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c67c170> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c67a300> <<< 30583 1726853666.83540: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 30583 1726853666.83610: stdout chunk (state=3): >>>import '_string' # <<< 30583 1726853666.83777: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c67f950> <<< 30583 1726853666.83894: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c67c320> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c680740> <<< 30583 1726853666.83928: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c6809b0> <<< 30583 1726853666.83983: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c680c20> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c678380> <<< 30583 1726853666.84078: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc 
matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 30583 1726853666.84188: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c50c3e0> <<< 30583 1726853666.84363: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853666.84379: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c50d610> <<< 30583 1726853666.84390: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c682b70> <<< 30583 1726853666.84420: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853666.84433: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c683ef0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c6827b0> <<< 30583 1726853666.84483: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.84495: stdout chunk 
(state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 30583 1726853666.84655: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.84658: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.84808: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 30583 1726853666.84821: stdout chunk (state=3): >>> import 'ansible.module_utils.common' # <<< 30583 1726853666.84910: stdout chunk (state=3): >>> # zipimport: zlib available<<< 30583 1726853666.84914: stdout chunk (state=3): >>> <<< 30583 1726853666.84926: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available<<< 30583 1726853666.85046: stdout chunk (state=3): >>> <<< 30583 1726853666.85130: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.85788: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.85814: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.86384: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 30583 1726853666.86387: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 30583 1726853666.86389: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 30583 1726853666.86413: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 30583 1726853666.86429: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 30583 1726853666.86502: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from 
'/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c511670> <<< 30583 1726853666.86579: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 30583 1726853666.86668: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c512420> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c50d730> import 'ansible.module_utils.compat.selinux' # <<< 30583 1726853666.86684: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.86714: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.86717: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 30583 1726853666.86725: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.86881: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.87100: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c5124e0> # zipimport: zlib available <<< 30583 1726853666.87547: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.88165: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 30583 1726853666.88193: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.88288: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 30583 1726853666.88313: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.88393: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 30583 1726853666.88410: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.88423: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 30583 1726853666.88497: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.88500: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.88516: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 30583 1726853666.88532: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.88768: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.89003: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 30583 1726853666.89193: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c513500> # zipimport: zlib available <<< 30583 1726853666.89233: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.89366: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 30583 1726853666.89481: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 30583 1726853666.89519: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.89531: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.89592: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 
1726853666.89734: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 30583 1726853666.89812: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853666.89816: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c51e000> <<< 30583 1726853666.89845: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c518f80> <<< 30583 1726853666.89882: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 30583 1726853666.89911: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.90023: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30583 1726853666.90055: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.90176: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 30583 
1726853666.90225: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 30583 1726853666.90248: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 30583 1726853666.90342: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 30583 1726853666.90460: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c606900> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c7e65d0> <<< 30583 1726853666.90494: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c51e090> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c6824b0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available <<< 30583 1726853666.90517: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 30583 1726853666.90529: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 30583 1726853666.90579: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 30583 1726853666.90681: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 30583 1726853666.90775: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 30583 1726853666.90803: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.90890: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.91006: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 30583 
1726853666.91252: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 30583 1726853666.91369: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.91688: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 30583 1726853666.91796: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c5ae270> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 30583 1726853666.91821: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 30583 1726853666.91870: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 30583 1726853666.92014: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from 
'/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c16ff50> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c174200> <<< 30583 1726853666.92086: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c59ad20> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c5aede0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c5ac9e0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c5ac470> <<< 30583 1726853666.92101: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 30583 1726853666.92168: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 30583 1726853666.92190: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 30583 1726853666.92212: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 30583 1726853666.92227: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 30583 1726853666.92332: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed 
from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c177290> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c176b40> <<< 30583 1726853666.92335: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c176d20> <<< 30583 1726853666.92337: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c175f70> <<< 30583 1726853666.92549: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 30583 1726853666.92553: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 30583 1726853666.92556: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c177470> <<< 30583 1726853666.92558: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 30583 1726853666.92561: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 30583 1726853666.92579: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c1d5f70> <<< 30583 1726853666.92604: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c177f50> <<< 30583 1726853666.92657: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c5ac620> import 'ansible.module_utils.facts.timeout' # <<< 30583 1726853666.92698: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 30583 1726853666.92817: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30583 1726853666.92824: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 30583 1726853666.92846: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.92900: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.92950: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 30583 1726853666.92968: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.92991: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 30583 1726853666.93029: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.93054: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 30583 1726853666.93072: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.93118: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.93169: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 30583 1726853666.93217: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.93236: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 30583 1726853666.93262: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 30583 1726853666.93412: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30583 1726853666.93416: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.93456: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.93629: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 30583 1726853666.93632: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 30583 1726853666.94020: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.94461: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 30583 1726853666.94543: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30583 1726853666.94576: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.94614: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.94665: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 30583 1726853666.94929: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 30583 1726853666.94932: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.94935: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 30583 1726853666.94937: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.94939: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.94964: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 30583 1726853666.94973: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 
1726853666.94999: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # <<< 30583 1726853666.95083: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.95099: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.95180: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 30583 1726853666.95200: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 30583 1726853666.95398: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c1d7140> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c1d6ba0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 30583 1726853666.95466: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.95692: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 30583 1726853666.95695: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.95699: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.95721: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 30583 1726853666.95735: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.95797: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.95912: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 30583 1726853666.95966: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches 
/usr/lib64/python3.12/ssl.py <<< 30583 1726853666.96022: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 30583 1726853666.96239: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853666.96242: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c212270> <<< 30583 1726853666.96349: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c2030b0> <<< 30583 1726853666.96369: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 30583 1726853666.96425: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.96493: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 30583 1726853666.96674: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.96677: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.96680: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.96779: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.96932: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 30583 1726853666.96981: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.97013: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 30583 1726853666.97035: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.97067: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.97122: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 30583 1726853666.97179: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853666.97430: stdout chunk (state=3): >>>import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c225dc0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c203260> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 30583 1726853666.97433: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.97436: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 30583 1726853666.97438: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.97491: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.97644: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 30583 1726853666.97650: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.97749: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.97875: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.97897: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.97933: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 30583 1726853666.97953: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 30583 1726853666.98020: stdout chunk (state=3): >>># zipimport: zlib 
available # zipimport: zlib available <<< 30583 1726853666.98277: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.98281: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 30583 1726853666.98295: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.98446: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.98530: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 30583 1726853666.98550: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.98670: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30583 1726853666.99183: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853666.99703: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 30583 1726853666.99720: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.00095: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 30583 1726853667.00299: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available <<< 30583 1726853667.00456: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 30583 1726853667.00485: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 30583 1726853667.00503: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.00541: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.00591: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 30583 
1726853667.00602: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.00794: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30583 1726853667.01066: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.01293: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 30583 1726853667.01319: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.01341: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 30583 1726853667.01356: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.01616: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available <<< 30583 1726853667.01682: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 30583 1726853667.01693: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.01748: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.01826: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 30583 1726853667.02188: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.02483: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 30583 1726853667.02517: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.02543: stdout 
chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 30583 1726853667.02562: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.02590: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.02821: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available <<< 30583 1726853667.02865: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 30583 1726853667.02902: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 30583 1726853667.02914: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.03019: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.03022: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 30583 1726853667.03025: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.03146: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 30583 1726853667.03364: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.03368: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available <<< 30583 1726853667.03460: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 30583 1726853667.03626: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.03824: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib 
available <<< 30583 1726853667.03893: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.04002: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 30583 1726853667.04025: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 30583 1726853667.04042: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.04145: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.04209: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 30583 1726853667.04300: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available <<< 30583 1726853667.04399: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 30583 1726853667.04580: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.05466: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 30583 1726853667.05519: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 30583 1726853667.05544: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 30583 1726853667.05573: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c0274a0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c026180> <<< 30583 1726853667.05683: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c026240> <<< 30583 1726853667.05995: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 60520 10.31.9.197 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 60520 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": 
true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": <<< 30583 1726853667.06097: stdout chunk (state=3): >>>"10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDiy4Yen7eiWP0/hmH4/5WHzI91c8NPRAJCku4Kk63/nAM2/HDHVpCGbs8kPnAcpJ95BGnY2AZ50i/GjByh6rqN4q0QNajZqOQdMrkomTRQGFsaoQTUzu+Wt7NYtajPseEV2zJTYbIlIC8H5nwTib7SkZscdc1iTw0saFFpV/aB+l5BDLfOe5EeE772aMDPUwKIw9RVy45e9Dl7uEv/Ez5XL/ZsZ8K0iZ4v2/Ebj39j+tw5M9hEjzRp4dqgv4FTXaFf2TvCql8dulUOPsjMu2MIvIfB4FbPNXrGKPKbzkjxWn4r+wUuvMPr4zoIJieVXFTR6ozZdzis6d3WFGAgZgX3ns+ULgR+lp0ZvHZb2amOGE8aM1TdwnDCeanweLvXk4zxXrpg0T4bTmQwKkDtd0DFml2CkWe4615TK07c49NoApmnEgPdztwxtraghMO72UOZkRBgUDB5GKSc202pCChA/GqiwfaUPdjS4LyUdkhgYAUniLPI2FRsZg4+EpoMZgs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAviMKS0iYCdMhDNjaRFlzVurOd6RVFe0VKYVOOZJko3KaULgIYAaS/l/1rRBz1963986hrDhKrLwmMRxr85S4Q=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAGtlq4ktcSkdXJkETJjSEIO/6xbcTDcVVefyj1D7mpG", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_pkg_mgr": "dnf", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", 
"ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "34", "second": "27", "epoch": "1726853667", "epoch_int": "1726853667", "date": "2024-09-20", "time": "13:34:27", "iso8601_micro": "2024-09-20T17:34:27.053772Z", "iso8601": "2024-09-20T17:34:27Z", "iso8601_basic": "20240920T133427053772", "iso8601_basic_short": "20240920T133427", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-197.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-197", "ansible_nodename": "ip-10-31-9-197.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2134955d8b5184190900489dab957f", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 30583 1726853667.06777: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 30583 1726853667.06780: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # 
restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat <<< 30583 1726853667.06878: stdout chunk (state=3): >>># cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing 
importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd <<< 30583 1726853667.06887: 
stdout chunk (state=3): >>># cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd <<< 30583 1726853667.06892: stdout chunk (state=3): >>># cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing 
copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # 
cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] 
removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl <<< 30583 1726853667.06895: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing 
ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd<<< 30583 1726853667.06986: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # 
destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd <<< 30583 1726853667.06990: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly 
# destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 30583 1726853667.07338: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 30583 1726853667.07497: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 30583 1726853667.07528: stdout chunk (state=3): >>># destroy _locale # destroy locale # destroy 
select # destroy _signal # destroy _posixsubprocess <<< 30583 1726853667.07548: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 30583 1726853667.07585: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 30583 1726853667.07603: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 30583 1726853667.07648: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool <<< 30583 1726853667.07674: stdout chunk (state=3): >>># destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle <<< 30583 1726853667.07894: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 30583 1726853667.07977: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc <<< 30583 1726853667.07982: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping 
systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 30583 1726853667.07997: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 30583 1726853667.08049: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 30583 1726853667.08088: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 30583 1726853667.08163: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping 
_frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io <<< 30583 1726853667.08277: stdout chunk (state=3): >>># cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 30583 1726853667.08373: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 30583 1726853667.08415: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 30583 1726853667.08448: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 30583 1726853667.08588: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 30583 1726853667.08620: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 30583 1726853667.08657: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # 
destroy _bisect # destroy time <<< 30583 1726853667.08716: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 30583 1726853667.08757: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re <<< 30583 1726853667.08805: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 30583 1726853667.09160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853667.09451: stderr chunk (state=3): >>>Shared connection to 10.31.9.197 closed. <<< 30583 1726853667.09457: stdout chunk (state=3): >>><<< 30583 1726853667.09459: stderr chunk (state=3): >>><<< 30583 1726853667.09600: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d1bc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d18bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 
'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d1bea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d1cd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d1cdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cfabe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cfabf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cfe3890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cfe3f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cfc3b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cfc1280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cfa9040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d003800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d002420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cfc2150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d000b60> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d038860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cfa82c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5d038d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d038bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5d038f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cfa6de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d039610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d0392e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d03a510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d050710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5d051df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d052c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5d0532f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d0521e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5d053d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d0534a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d03a540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5cd47c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5cd707a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cd70500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5cd70710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5cd710d0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5cd71ac0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cd70980> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cd45df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cd72ea0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cd71be0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5d03ac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cd9b1d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cdbf590> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5ce202f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5ce22a20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5ce203e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cde5310> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c7213a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cdbe390> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5cd73da0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f3d5cdbe990> # zipimport: found 103 names in '/tmp/ansible_setup_payload_1hdo05h_/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c78b080> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c769f70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c769100> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c788f20> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c7ba990> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c7ba780> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c7ba090> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c7baab0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c78bb00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c7bb740> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c7bb980> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c7bbe90> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c625c10> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c627800> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c628200> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c629100> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c62be30> # extension module 
'_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5cfa6ed0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c62a0f0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c633d40> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c632810> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c632570> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c632ae0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c62a600> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c677a10> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c6781a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c679c10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c6799d0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c67c170> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c67a300> # 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c67f950> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c67c320> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c680740> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c6809b0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c680c20> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c678380> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c50c3e0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c50d610> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c682b70> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c683ef0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c6827b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' 
# import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c511670> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c512420> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c50d730> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c5124e0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c513500> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c51e000> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c518f80> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c606900> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c7e65d0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c51e090> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c6824b0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c5ae270> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c16ff50> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c174200> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c59ad20> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f3d5c5aede0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c5ac9e0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c5ac470> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c177290> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c176b40> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c176d20> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c175f70> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c177470> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c1d5f70> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c177f50> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c5ac620> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c1d7140> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c1d6ba0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c212270> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c2030b0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c225dc0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c203260> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3d5c0274a0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c026180> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3d5c026240> {"ansible_facts": {"ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 60520 10.31.9.197 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 60520 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 
0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDiy4Yen7eiWP0/hmH4/5WHzI91c8NPRAJCku4Kk63/nAM2/HDHVpCGbs8kPnAcpJ95BGnY2AZ50i/GjByh6rqN4q0QNajZqOQdMrkomTRQGFsaoQTUzu+Wt7NYtajPseEV2zJTYbIlIC8H5nwTib7SkZscdc1iTw0saFFpV/aB+l5BDLfOe5EeE772aMDPUwKIw9RVy45e9Dl7uEv/Ez5XL/ZsZ8K0iZ4v2/Ebj39j+tw5M9hEjzRp4dqgv4FTXaFf2TvCql8dulUOPsjMu2MIvIfB4FbPNXrGKPKbzkjxWn4r+wUuvMPr4zoIJieVXFTR6ozZdzis6d3WFGAgZgX3ns+ULgR+lp0ZvHZb2amOGE8aM1TdwnDCeanweLvXk4zxXrpg0T4bTmQwKkDtd0DFml2CkWe4615TK07c49NoApmnEgPdztwxtraghMO72UOZkRBgUDB5GKSc202pCChA/GqiwfaUPdjS4LyUdkhgYAUniLPI2FRsZg4+EpoMZgs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAviMKS0iYCdMhDNjaRFlzVurOd6RVFe0VKYVOOZJko3KaULgIYAaS/l/1rRBz1963986hrDhKrLwmMRxr85S4Q=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAGtlq4ktcSkdXJkETJjSEIO/6xbcTDcVVefyj1D7mpG", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_pkg_mgr": "dnf", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "34", "second": "27", "epoch": "1726853667", "epoch_int": "1726853667", "date": "2024-09-20", "time": "13:34:27", "iso8601_micro": "2024-09-20T17:34:27.053772Z", "iso8601": "2024-09-20T17:34:27Z", "iso8601_basic": "20240920T133427053772", "iso8601_basic_short": "20240920T133427", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-197.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-197", "ansible_nodename": "ip-10-31-9-197.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2134955d8b5184190900489dab957f", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], 
"module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum 
# cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # 
cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy 
ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing 
ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing 
ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] 
removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other 
# destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy 
ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy 
_signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] 
wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # 
destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. [WARNING]: Module invocation had junk after the JSON data: [... duplicate Python interpreter teardown dump elided; identical to the cleanup/wiping output above ...] 30583 1726853667.11487: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs',
'9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853666.5229979-30676-248925546046076/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853667.11490: _low_level_execute_command(): starting 30583 1726853667.11493: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853666.5229979-30676-248925546046076/ > /dev/null 2>&1 && sleep 0' 30583 1726853667.11820: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853667.11823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853667.11826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853667.11828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853667.11883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853667.12276: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 30583 1726853667.12397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853667.14404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853667.14409: stdout chunk (state=3): >>><<< 30583 1726853667.14412: stderr chunk (state=3): >>><<< 30583 1726853667.14526: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853667.14529: handler run complete 30583 1726853667.14782: variable 'ansible_facts' from source: unknown 30583 1726853667.14785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853667.15227: variable 'ansible_facts' from source: unknown 30583 1726853667.15289: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853667.15481: attempt loop complete, returning result 30583 1726853667.15521: _execute() done 30583 1726853667.15530: dumping result to json 30583 1726853667.15588: done dumping result, returning 30583 1726853667.15602: done running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [02083763-bbaf-05ea-abc5-00000000002c] 30583 1726853667.15612: sending task result for task 02083763-bbaf-05ea-abc5-00000000002c 30583 1726853667.15851: done sending task result for task 02083763-bbaf-05ea-abc5-00000000002c 30583 1726853667.16077: WORKER PROCESS EXITING ok: [managed_node2] 30583 1726853667.16180: no more pending results, returning what we have 30583 1726853667.16183: results queue empty 30583 1726853667.16184: checking for any_errors_fatal 30583 1726853667.16185: done checking for any_errors_fatal 30583 1726853667.16186: checking for max_fail_percentage 30583 1726853667.16188: done checking for max_fail_percentage 30583 1726853667.16188: checking to see if all hosts have failed and the running result is not ok 30583 1726853667.16189: done checking to see if all hosts have failed 30583 1726853667.16190: getting the remaining hosts for this loop 30583 1726853667.16191: done getting the remaining hosts for this loop 30583 1726853667.16194: getting the next task for host managed_node2 30583 1726853667.16202: done getting next task for host managed_node2 30583 1726853667.16204: ^ task is: TASK: Check if system is ostree 30583 1726853667.16207: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853667.16210: getting variables 30583 1726853667.16211: in VariableManager get_vars() 30583 1726853667.16237: Calling all_inventory to load vars for managed_node2 30583 1726853667.16240: Calling groups_inventory to load vars for managed_node2 30583 1726853667.16243: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853667.16252: Calling all_plugins_play to load vars for managed_node2 30583 1726853667.16257: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853667.16260: Calling groups_plugins_play to load vars for managed_node2 30583 1726853667.16535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853667.16741: done with get_vars() 30583 1726853667.16751: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 13:34:27 -0400 (0:00:00.736) 0:00:02.505 ****** 30583 1726853667.16844: entering _queue_task() for managed_node2/stat 30583 1726853667.17105: worker is 1 (out of 1 available) 30583 1726853667.17118: exiting _queue_task() for managed_node2/stat 30583 1726853667.17129: done queuing things up, now waiting for results queue to drain 30583 1726853667.17131: waiting for pending results... 
30583 1726853667.17406: running TaskExecutor() for managed_node2/TASK: Check if system is ostree 30583 1726853667.17613: in run() - task 02083763-bbaf-05ea-abc5-00000000002e 30583 1726853667.17617: variable 'ansible_search_path' from source: unknown 30583 1726853667.17620: variable 'ansible_search_path' from source: unknown 30583 1726853667.17679: calling self._execute() 30583 1726853667.17743: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853667.17752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853667.17767: variable 'omit' from source: magic vars 30583 1726853667.18349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853667.18521: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853667.18576: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853667.18614: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853667.18679: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853667.18768: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853667.18806: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853667.18834: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853667.18865: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853667.19026: Evaluated conditional (not __network_is_ostree is defined): True 30583 1726853667.19037: variable 'omit' from source: magic vars 30583 1726853667.19101: variable 'omit' from source: magic vars 30583 1726853667.19168: variable 'omit' from source: magic vars 30583 1726853667.19198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853667.19322: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853667.19341: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853667.19363: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853667.19381: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853667.19429: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853667.19438: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853667.19441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853667.19677: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853667.19680: Set connection var ansible_timeout to 10 30583 1726853667.19683: Set connection var ansible_connection to ssh 30583 1726853667.19685: Set connection var ansible_shell_executable to /bin/sh 30583 1726853667.19687: Set connection var ansible_shell_type to sh 30583 1726853667.19689: Set connection var ansible_pipelining to False 30583 1726853667.19713: variable 'ansible_shell_executable' from source: unknown 30583 1726853667.19719: variable 'ansible_connection' from 
source: unknown 30583 1726853667.19726: variable 'ansible_module_compression' from source: unknown 30583 1726853667.19732: variable 'ansible_shell_type' from source: unknown 30583 1726853667.19737: variable 'ansible_shell_executable' from source: unknown 30583 1726853667.19743: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853667.19754: variable 'ansible_pipelining' from source: unknown 30583 1726853667.19763: variable 'ansible_timeout' from source: unknown 30583 1726853667.19772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853667.20138: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853667.20577: variable 'omit' from source: magic vars 30583 1726853667.20580: starting attempt loop 30583 1726853667.20583: running the handler 30583 1726853667.20585: _low_level_execute_command(): starting 30583 1726853667.20587: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853667.21614: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853667.21634: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853667.21717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853667.21785: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853667.21833: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853667.21902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30583 1726853667.24230: stdout chunk (state=3): >>>/root <<< 30583 1726853667.24407: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853667.24411: stdout chunk (state=3): >>><<< 30583 1726853667.24416: stderr chunk (state=3): >>><<< 30583 1726853667.24635: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30583 1726853667.24648: _low_level_execute_command(): starting 30583 1726853667.24651: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853667.2444422-30699-179357836274177 `" && echo ansible-tmp-1726853667.2444422-30699-179357836274177="` echo /root/.ansible/tmp/ansible-tmp-1726853667.2444422-30699-179357836274177 `" ) && sleep 0' 30583 1726853667.25888: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853667.25951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853667.25977: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853667.26069: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853667.26140: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853667.26432: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30583 1726853667.26535: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 3 <<< 30583 1726853667.29411: stdout chunk (state=3): >>>ansible-tmp-1726853667.2444422-30699-179357836274177=/root/.ansible/tmp/ansible-tmp-1726853667.2444422-30699-179357836274177 <<< 30583 1726853667.29619: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853667.29629: stdout chunk (state=3): >>><<< 30583 1726853667.29643: stderr chunk (state=3): >>><<< 30583 1726853667.29668: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853667.2444422-30699-179357836274177=/root/.ansible/tmp/ansible-tmp-1726853667.2444422-30699-179357836274177 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 3 debug2: Received exit status from master 0 30583 1726853667.29734: variable 'ansible_module_compression' from source: 
unknown 30583 1726853667.29801: ANSIBALLZ: Using lock for stat 30583 1726853667.29808: ANSIBALLZ: Acquiring lock 30583 1726853667.29815: ANSIBALLZ: Lock acquired: 139827455547856 30583 1726853667.29826: ANSIBALLZ: Creating module 30583 1726853667.47191: ANSIBALLZ: Writing module into payload 30583 1726853667.47329: ANSIBALLZ: Writing module 30583 1726853667.47406: ANSIBALLZ: Renaming module 30583 1726853667.47409: ANSIBALLZ: Done creating module 30583 1726853667.47420: variable 'ansible_facts' from source: unknown 30583 1726853667.47501: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853667.2444422-30699-179357836274177/AnsiballZ_stat.py 30583 1726853667.47753: Sending initial data 30583 1726853667.47760: Sent initial data (153 bytes) 30583 1726853667.48393: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853667.48437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853667.48461: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853667.48481: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853667.48596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30583 1726853667.50703: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853667.50708: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853667.50790: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp9__3h4os /root/.ansible/tmp/ansible-tmp-1726853667.2444422-30699-179357836274177/AnsiballZ_stat.py <<< 30583 1726853667.50794: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853667.2444422-30699-179357836274177/AnsiballZ_stat.py" <<< 30583 1726853667.50893: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp9__3h4os" to remote "/root/.ansible/tmp/ansible-tmp-1726853667.2444422-30699-179357836274177/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853667.2444422-30699-179357836274177/AnsiballZ_stat.py" <<< 30583 1726853667.52322: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853667.52335: stdout chunk (state=3): >>><<< 30583 1726853667.52350: stderr chunk (state=3): >>><<< 30583 1726853667.52516: done transferring module to remote 30583 1726853667.52519: _low_level_execute_command(): starting 30583 1726853667.52521: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853667.2444422-30699-179357836274177/ /root/.ansible/tmp/ansible-tmp-1726853667.2444422-30699-179357836274177/AnsiballZ_stat.py && sleep 0' 30583 1726853667.53146: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853667.53152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853667.53206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853667.53209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853667.53255: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853667.53258: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853667.53392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30583 1726853667.56083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853667.56087: stdout chunk (state=3): >>><<< 30583 1726853667.56089: stderr chunk (state=3): >>><<< 30583 1726853667.56116: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30583 1726853667.56194: _low_level_execute_command(): starting 30583 1726853667.56197: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853667.2444422-30699-179357836274177/AnsiballZ_stat.py && sleep 0' 30583 1726853667.56788: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853667.56813: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853667.56829: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853667.56842: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853667.56952: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 4 <<< 30583 1726853667.60080: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 30583 1726853667.60088: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 30583 1726853667.60104: stdout chunk (state=3): >>>import 'posix' # <<< 30583 1726853667.60107: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 30583 1726853667.60208: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 30583 1726853667.60212: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 30583 1726853667.60214: stdout chunk (state=3): >>>import '_codecs' # <<< 30583 1726853667.60388: stdout chunk (state=3): >>>import 'codecs' # <<< 30583 1726853667.60413: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c46184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c45e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c461aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # <<< 30583 1726853667.60444: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 30583 1726853667.60532: stdout chunk 
(state=3): >>>import '_collections_abc' # <<< 30583 1726853667.60558: stdout chunk (state=3): >>>import 'genericpath' # <<< 30583 1726853667.60698: stdout chunk (state=3): >>>import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c442d130> <<< 30583 1726853667.60757: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 30583 1726853667.60800: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c442dfa0> <<< 30583 1726853667.60803: stdout chunk (state=3): >>>import 'site' # <<< 30583 1726853667.60830: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 30583 1726853667.61061: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 30583 1726853667.61084: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 30583 1726853667.61286: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 30583 1726853667.61296: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 30583 1726853667.61314: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c446bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c446bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 30583 1726853667.61324: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 30583 1726853667.61342: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 30583 1726853667.61402: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 30583 1726853667.61412: stdout chunk (state=3): 
>>>import 'itertools' # <<< 30583 1726853667.61441: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<< 30583 1726853667.61462: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c44a3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 30583 1726853667.61483: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c44a3ec0> <<< 30583 1726853667.61725: stdout chunk (state=3): >>>import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c4483b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c44812b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c4469070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 30583 1726853667.61729: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 30583 1726853667.61731: stdout chunk (state=3): >>>import '_sre' # <<< 30583 1726853667.61749: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 30583 1726853667.61773: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 30583 1726853667.61838: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 30583 
1726853667.61842: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c44c37d0> <<< 30583 1726853667.61846: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c44c23f0> <<< 30583 1726853667.61874: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 30583 1726853667.61889: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c4482150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c44c0bc0> <<< 30583 1726853667.61931: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 30583 1726853667.61965: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c44f8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c44682f0> <<< 30583 1726853667.61969: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 30583 1726853667.62054: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 30583 1726853667.62058: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853667.62211: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c44f8d40> import 'struct' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc7c44f8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c44f8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c4466e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c44f9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c44f9370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c44fa540> <<< 30583 1726853667.62234: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 30583 1726853667.62260: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 30583 1726853667.62299: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 30583 1726853667.62331: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from 
'/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c4510740> <<< 30583 1726853667.62347: stdout chunk (state=3): >>>import 'errno' # <<< 30583 1726853667.62369: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853667.62400: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c4511e20> <<< 30583 1726853667.62403: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 30583 1726853667.62422: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 30583 1726853667.62492: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 30583 1726853667.62495: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c4512cc0> <<< 30583 1726853667.62498: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c45132f0> <<< 30583 1726853667.62699: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c4512210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from 
'/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c4513d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c45134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c44fa4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 30583 1726853667.62724: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 30583 1726853667.62756: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c4297c50> <<< 30583 1726853667.62778: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 30583 1726853667.62795: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 30583 1726853667.62818: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7fc7c42c07a0> <<< 30583 1726853667.62831: stdout chunk (state=3): >>>import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c42c0500> <<< 30583 1726853667.62847: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c42c07d0> <<< 30583 1726853667.62875: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 30583 1726853667.62890: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 30583 1726853667.62955: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853667.63086: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853667.63097: stdout chunk (state=3): >>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c42c1100> <<< 30583 1726853667.63364: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c42c1af0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c42c09b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c4295df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # 
code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 30583 1726853667.63367: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 30583 1726853667.63370: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c42c2f00> <<< 30583 1726853667.63374: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c42c1c40> <<< 30583 1726853667.63392: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c44fac60> <<< 30583 1726853667.63414: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 30583 1726853667.63481: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 30583 1726853667.63577: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 30583 1726853667.63579: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 30583 1726853667.63581: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c42eb230> <<< 30583 1726853667.63617: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 30583 1726853667.63703: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 30583 1726853667.63705: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches 
/usr/lib64/python3.12/contextlib.py <<< 30583 1726853667.63707: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 30583 1726853667.63822: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c430f5f0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 30583 1726853667.63829: stdout chunk (state=3): >>>import 'ntpath' # <<< 30583 1726853667.63860: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c4370380> <<< 30583 1726853667.64075: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 30583 1726853667.64078: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 30583 1726853667.64081: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 30583 1726853667.64082: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 30583 1726853667.64084: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c4372ae0> <<< 30583 1726853667.64365: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c43704a0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c4331370> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches 
/usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c4175430> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c430e3f0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c42c3e00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fc7c430e750> <<< 30583 1726853667.64534: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_fd7n39tz/ansible_stat_payload.zip' # zipimport: zlib available <<< 30583 1726853667.64997: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c41cb170> import '_typing' # <<< 30583 1726853667.65052: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c41aa060> <<< 30583 1726853667.65078: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c41a91c0> # zipimport: zlib available <<< 30583 1726853667.65084: stdout chunk (state=3): >>>import 'ansible' # <<< 30583 1726853667.65108: stdout chunk (state=3): >>># zipimport: zlib available # 
zipimport: zlib available <<< 30583 1726853667.65158: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 30583 1726853667.65161: stdout chunk (state=3): >>> # zipimport: zlib available <<< 30583 1726853667.66609: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.67833: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c41c9040> <<< 30583 1726853667.67911: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 30583 1726853667.67938: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c41f2a20> <<< 30583 1726853667.68386: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c41f27b0> <<< 30583 1726853667.68390: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c41f20c0> <<< 30583 1726853667.68392: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 30583 1726853667.68394: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 30583 1726853667.68398: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c41f2b10> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c41cbe00> <<< 30583 1726853667.68400: stdout chunk (state=3): >>>import 'atexit' # <<< 30583 1726853667.68402: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c41f37a0> <<< 30583 1726853667.68405: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c41f39e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c41f3ef0> import 'pwd' # <<< 30583 1726853667.68414: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 30583 1726853667.68417: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 30583 1726853667.68419: stdout chunk (state=3): >>>import 'platform' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b15cd0> <<< 30583 1726853667.68421: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853667.68544: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c3b178f0> <<< 30583 1726853667.68548: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 30583 1726853667.68551: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 30583 1726853667.68553: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b182f0> <<< 30583 1726853667.68557: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 30583 1726853667.68559: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 30583 1726853667.68764: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b19490> <<< 30583 1726853667.68768: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b1bf80> <<< 30583 1726853667.68775: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from 
'/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c3b202c0> <<< 30583 1726853667.68777: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b1a240> <<< 30583 1726853667.68782: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 30583 1726853667.68806: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 30583 1726853667.68833: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 30583 1726853667.68894: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 30583 1726853667.68916: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 30583 1726853667.68928: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b23ec0> <<< 30583 1726853667.69086: stdout chunk (state=3): >>>import '_tokenize' # <<< 30583 1726853667.69107: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b22990> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b226f0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc 
matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 30583 1726853667.69118: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b22c60> <<< 30583 1726853667.69195: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b1a750> <<< 30583 1726853667.69201: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853667.69204: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c3b6bf80> <<< 30583 1726853667.69207: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b6c260> <<< 30583 1726853667.69376: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 30583 1726853667.69382: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 30583 1726853667.69395: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed 
from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c3b6dcd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b6da90> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 30583 1726853667.69470: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 30583 1726853667.69523: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c3b70230> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b6e390> <<< 30583 1726853667.69545: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 30583 1726853667.69763: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b739b0> <<< 30583 1726853667.69807: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b703b0> <<< 30583 1726853667.69984: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853667.69992: stdout chunk (state=3): >>># extension module 'systemd._journal' executed 
from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c3b747a0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c3b74bc0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c3b74c80> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b6c380> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 30583 1726853667.70009: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 30583 1726853667.70034: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 30583 1726853667.70066: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853667.70091: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c3a00320> 
<<< 30583 1726853667.70245: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 30583 1726853667.70266: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c3a016d0> <<< 30583 1726853667.70494: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b76ab0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c3b77e60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b766f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available <<< 30583 1726853667.70535: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.70556: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 30583 1726853667.70583: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 30583 1726853667.70604: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.70776: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.70847: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.71415: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.71969: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 30583 
1726853667.71987: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 30583 1726853667.72007: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 30583 1726853667.72031: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 30583 1726853667.72087: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c3a05970> <<< 30583 1726853667.72168: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 30583 1726853667.72187: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3a06720> <<< 30583 1726853667.72203: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3a017f0> <<< 30583 1726853667.72328: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 30583 1726853667.72331: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.72391: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 30583 1726853667.72462: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.72622: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches 
/usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 30583 1726853667.72643: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3a064e0> # zipimport: zlib available <<< 30583 1726853667.73122: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.73576: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.73880: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available <<< 30583 1726853667.73884: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 30583 1726853667.73892: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.73894: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.74076: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 30583 1726853667.74081: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 30583 1726853667.74084: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.74086: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.74103: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 30583 1726853667.74117: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.74350: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.74675: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 30583 1726853667.74679: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 30583 1726853667.74973: stdout chunk (state=3): >>>import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fc7c3a07a10> # zipimport: zlib available # zipimport: zlib available <<< 30583 1726853667.74976: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 30583 1726853667.74979: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 30583 1726853667.74981: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.75015: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.75053: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 30583 1726853667.75066: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.75108: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.75148: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.75210: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.75376: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 30583 1726853667.75502: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c3a12450> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3a0d1c0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 30583 1726853667.75578: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 30583 1726853667.75635: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.75684: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.75793: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 30583 1726853667.75844: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 30583 1726853667.75994: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b02d20> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c422e9f0> <<< 30583 1726853667.76059: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3a125d0> <<< 30583 1726853667.76082: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3a01b50> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 30583 1726853667.76106: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.76134: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 
'ansible.module_utils.common.sys_info' # <<< 30583 1726853667.76292: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 30583 1726853667.76382: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.76586: stdout chunk (state=3): >>># zipimport: zlib available <<< 30583 1726853667.76707: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 30583 1726853667.76776: stdout chunk (state=3): >>># destroy __main__ <<< 30583 1726853667.77117: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 30583 1726853667.77140: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path <<< 30583 1726853667.77477: stdout chunk (state=3): >>># restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # 
cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword <<< 30583 1726853667.77485: stdout chunk (state=3): >>># cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] 
removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] 
removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation 
# cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 30583 1726853667.77567: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 30583 1726853667.77616: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 30583 1726853667.77633: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 30583 1726853667.77653: stdout chunk (state=3): >>># destroy ntpath <<< 30583 1726853667.77684: stdout chunk (state=3): >>># destroy importlib <<< 30583 1726853667.78014: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ 
# destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # 
cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 30583 1726853667.78197: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 30583 1726853667.78223: stdout chunk (state=3): >>># destroy _collections <<< 30583 1726853667.78246: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 30583 1726853667.78515: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy 
ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools <<< 30583 1726853667.78576: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 30583 1726853667.79024: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853667.79027: stdout chunk (state=3): >>><<< 30583 1726853667.79030: stderr chunk (state=3): >>><<< 30583 1726853667.79187: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7fc7c46184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c45e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c461aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c442d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c442dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c446bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c446bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c44a3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc7c44a3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c4483b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c44812b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c4469070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c44c37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c44c23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c4482150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c44c0bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c44f8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c44682f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c44f8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c44f8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c44f8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c4466e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c44f9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c44f9370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c44fa540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c4510740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c4511e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c4512cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c45132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c4512210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c4513d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c45134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c44fa4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c4297c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c42c07a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c42c0500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c42c07d0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c42c1100> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c42c1af0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c42c09b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c4295df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c42c2f00> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c42c1c40> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c44fac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc7c42eb230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c430f5f0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c4370380> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c4372ae0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c43704a0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c4331370> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc7c4175430> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c430e3f0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c42c3e00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fc7c430e750> # zipimport: found 30 names in '/tmp/ansible_stat_payload_fd7n39tz/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c41cb170> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c41aa060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c41a91c0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c41c9040> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c41f2a20> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c41f27b0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c41f20c0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c41f2b10> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c41cbe00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c41f37a0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c41f39e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c41f3ef0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b15cd0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c3b178f0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b182f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b19490> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b1bf80> # extension module 
'_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c3b202c0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b1a240> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b23ec0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b22990> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b226f0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b22c60> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b1a750> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c3b6bf80> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b6c260> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c3b6dcd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b6da90> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c3b70230> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b6e390> # 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b739b0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b703b0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c3b747a0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c3b74bc0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c3b74c80> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b6c380> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c3a00320> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c3a016d0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b76ab0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c3b77e60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b766f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' 
# import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c3a05970> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3a06720> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3a017f0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3a064e0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3a07a10> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7c3a12450> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3a0d1c0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3b02d20> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c422e9f0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3a125d0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7c3a01b50> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear 
sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing 
importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform 
# cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy 
importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # 
cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing 
_collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing 
pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # 
cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext 
# destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # 
cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy 
systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 30583 1726853667.80790: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, 
'_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853667.2444422-30699-179357836274177/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853667.80794: _low_level_execute_command(): starting 30583 1726853667.80876: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853667.2444422-30699-179357836274177/ > /dev/null 2>&1 && sleep 0' 30583 1726853667.81436: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853667.81439: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853667.81540: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853667.81567: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853667.81677: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853667.83719: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 30583 1726853667.83723: stdout chunk (state=3): >>><<< 30583 1726853667.83736: stderr chunk (state=3): >>><<< 30583 1726853667.83823: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853667.83826: handler run complete 30583 1726853667.83828: attempt loop complete, returning result 30583 1726853667.83830: _execute() done 30583 1726853667.83833: dumping result to json 30583 1726853667.83930: done dumping result, returning 30583 1726853667.83933: done running TaskExecutor() for managed_node2/TASK: Check if system is ostree [02083763-bbaf-05ea-abc5-00000000002e] 30583 1726853667.83935: sending task result for task 02083763-bbaf-05ea-abc5-00000000002e ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 30583 1726853667.84204: no more pending results, 
returning what we have 30583 1726853667.84207: results queue empty 30583 1726853667.84208: checking for any_errors_fatal 30583 1726853667.84218: done checking for any_errors_fatal 30583 1726853667.84219: checking for max_fail_percentage 30583 1726853667.84221: done checking for max_fail_percentage 30583 1726853667.84222: checking to see if all hosts have failed and the running result is not ok 30583 1726853667.84222: done checking to see if all hosts have failed 30583 1726853667.84223: getting the remaining hosts for this loop 30583 1726853667.84225: done getting the remaining hosts for this loop 30583 1726853667.84228: getting the next task for host managed_node2 30583 1726853667.84235: done getting next task for host managed_node2 30583 1726853667.84238: ^ task is: TASK: Set flag to indicate system is ostree 30583 1726853667.84241: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853667.84244: getting variables 30583 1726853667.84246: in VariableManager get_vars() 30583 1726853667.84358: Calling all_inventory to load vars for managed_node2 30583 1726853667.84362: Calling groups_inventory to load vars for managed_node2 30583 1726853667.84365: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853667.84379: Calling all_plugins_play to load vars for managed_node2 30583 1726853667.84382: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853667.84385: Calling groups_plugins_play to load vars for managed_node2 30583 1726853667.85016: done sending task result for task 02083763-bbaf-05ea-abc5-00000000002e 30583 1726853667.85019: WORKER PROCESS EXITING 30583 1726853667.85045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853667.85578: done with get_vars() 30583 1726853667.85591: done getting variables 30583 1726853667.85802: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 13:34:27 -0400 (0:00:00.689) 0:00:03.195 ****** 30583 1726853667.85833: entering _queue_task() for managed_node2/set_fact 30583 1726853667.85835: Creating lock for set_fact 30583 1726853667.86480: worker is 1 (out of 1 available) 30583 1726853667.86493: exiting _queue_task() for managed_node2/set_fact 30583 1726853667.86506: done queuing things up, now waiting for results queue to drain 30583 1726853667.86507: waiting for pending results... 
30583 1726853667.87089: running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree 30583 1726853667.87198: in run() - task 02083763-bbaf-05ea-abc5-00000000002f 30583 1726853667.87303: variable 'ansible_search_path' from source: unknown 30583 1726853667.87310: variable 'ansible_search_path' from source: unknown 30583 1726853667.87350: calling self._execute() 30583 1726853667.87522: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853667.87532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853667.87545: variable 'omit' from source: magic vars 30583 1726853667.88590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853667.89252: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853667.89403: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853667.89408: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853667.89445: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853667.89649: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853667.89712: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853667.89808: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853667.89948: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853667.90150: Evaluated conditional (not __network_is_ostree is defined): True 30583 1726853667.90177: variable 'omit' from source: magic vars 30583 1726853667.90219: variable 'omit' from source: magic vars 30583 1726853667.90680: variable '__ostree_booted_stat' from source: set_fact 30583 1726853667.90684: variable 'omit' from source: magic vars 30583 1726853667.90706: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853667.90739: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853667.90810: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853667.91005: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853667.91008: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853667.91010: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853667.91013: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853667.91015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853667.91177: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853667.91232: Set connection var ansible_timeout to 10 30583 1726853667.91241: Set connection var ansible_connection to ssh 30583 1726853667.91259: Set connection var ansible_shell_executable to /bin/sh 30583 1726853667.91290: Set connection var ansible_shell_type to sh 30583 1726853667.91304: Set connection var ansible_pipelining to False 30583 1726853667.91550: variable 'ansible_shell_executable' 
from source: unknown 30583 1726853667.91553: variable 'ansible_connection' from source: unknown 30583 1726853667.91556: variable 'ansible_module_compression' from source: unknown 30583 1726853667.91558: variable 'ansible_shell_type' from source: unknown 30583 1726853667.91560: variable 'ansible_shell_executable' from source: unknown 30583 1726853667.91562: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853667.91564: variable 'ansible_pipelining' from source: unknown 30583 1726853667.91566: variable 'ansible_timeout' from source: unknown 30583 1726853667.91568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853667.91687: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853667.91702: variable 'omit' from source: magic vars 30583 1726853667.91711: starting attempt loop 30583 1726853667.91717: running the handler 30583 1726853667.91730: handler run complete 30583 1726853667.91743: attempt loop complete, returning result 30583 1726853667.91748: _execute() done 30583 1726853667.91753: dumping result to json 30583 1726853667.91767: done dumping result, returning 30583 1726853667.91781: done running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree [02083763-bbaf-05ea-abc5-00000000002f] 30583 1726853667.91794: sending task result for task 02083763-bbaf-05ea-abc5-00000000002f 30583 1726853667.92097: done sending task result for task 02083763-bbaf-05ea-abc5-00000000002f 30583 1726853667.92100: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 30583 1726853667.92156: no more pending results, returning what we have 30583 1726853667.92159: results 
queue empty 30583 1726853667.92160: checking for any_errors_fatal 30583 1726853667.92166: done checking for any_errors_fatal 30583 1726853667.92167: checking for max_fail_percentage 30583 1726853667.92168: done checking for max_fail_percentage 30583 1726853667.92169: checking to see if all hosts have failed and the running result is not ok 30583 1726853667.92170: done checking to see if all hosts have failed 30583 1726853667.92173: getting the remaining hosts for this loop 30583 1726853667.92175: done getting the remaining hosts for this loop 30583 1726853667.92179: getting the next task for host managed_node2 30583 1726853667.92187: done getting next task for host managed_node2 30583 1726853667.92189: ^ task is: TASK: Fix CentOS6 Base repo 30583 1726853667.92192: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853667.92195: getting variables 30583 1726853667.92197: in VariableManager get_vars() 30583 1726853667.92227: Calling all_inventory to load vars for managed_node2 30583 1726853667.92230: Calling groups_inventory to load vars for managed_node2 30583 1726853667.92235: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853667.92245: Calling all_plugins_play to load vars for managed_node2 30583 1726853667.92247: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853667.92256: Calling groups_plugins_play to load vars for managed_node2 30583 1726853667.92586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853667.92787: done with get_vars() 30583 1726853667.92798: done getting variables 30583 1726853667.92921: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 13:34:27 -0400 (0:00:00.071) 0:00:03.266 ****** 30583 1726853667.92955: entering _queue_task() for managed_node2/copy 30583 1726853667.93398: worker is 1 (out of 1 available) 30583 1726853667.93409: exiting _queue_task() for managed_node2/copy 30583 1726853667.93419: done queuing things up, now waiting for results queue to drain 30583 1726853667.93421: waiting for pending results... 
30583 1726853667.93598: running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo 30583 1726853667.93760: in run() - task 02083763-bbaf-05ea-abc5-000000000031 30583 1726853667.93764: variable 'ansible_search_path' from source: unknown 30583 1726853667.93766: variable 'ansible_search_path' from source: unknown 30583 1726853667.93775: calling self._execute() 30583 1726853667.93863: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853667.93912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853667.93915: variable 'omit' from source: magic vars 30583 1726853667.94376: variable 'ansible_distribution' from source: facts 30583 1726853667.94401: Evaluated conditional (ansible_distribution == 'CentOS'): True 30583 1726853667.94530: variable 'ansible_distribution_major_version' from source: facts 30583 1726853667.94541: Evaluated conditional (ansible_distribution_major_version == '6'): False 30583 1726853667.94564: when evaluation is False, skipping this task 30583 1726853667.94567: _execute() done 30583 1726853667.94569: dumping result to json 30583 1726853667.94572: done dumping result, returning 30583 1726853667.94675: done running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo [02083763-bbaf-05ea-abc5-000000000031] 30583 1726853667.94679: sending task result for task 02083763-bbaf-05ea-abc5-000000000031 30583 1726853667.94746: done sending task result for task 02083763-bbaf-05ea-abc5-000000000031 30583 1726853667.94749: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 30583 1726853667.94820: no more pending results, returning what we have 30583 1726853667.94824: results queue empty 30583 1726853667.94825: checking for any_errors_fatal 30583 1726853667.94830: done checking for any_errors_fatal 30583 1726853667.94831: checking for 
max_fail_percentage 30583 1726853667.94833: done checking for max_fail_percentage 30583 1726853667.94834: checking to see if all hosts have failed and the running result is not ok 30583 1726853667.94834: done checking to see if all hosts have failed 30583 1726853667.94835: getting the remaining hosts for this loop 30583 1726853667.94837: done getting the remaining hosts for this loop 30583 1726853667.94841: getting the next task for host managed_node2 30583 1726853667.94847: done getting next task for host managed_node2 30583 1726853667.94850: ^ task is: TASK: Include the task 'enable_epel.yml' 30583 1726853667.94853: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853667.94858: getting variables 30583 1726853667.94860: in VariableManager get_vars() 30583 1726853667.94986: Calling all_inventory to load vars for managed_node2 30583 1726853667.94996: Calling groups_inventory to load vars for managed_node2 30583 1726853667.95000: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853667.95012: Calling all_plugins_play to load vars for managed_node2 30583 1726853667.95016: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853667.95019: Calling groups_plugins_play to load vars for managed_node2 30583 1726853667.95337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853667.95541: done with get_vars() 30583 1726853667.95552: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 13:34:27 -0400 (0:00:00.026) 0:00:03.293 ****** 30583 1726853667.95648: entering _queue_task() for managed_node2/include_tasks 30583 1726853667.96269: worker is 1 (out of 1 available) 30583 1726853667.96284: exiting _queue_task() for managed_node2/include_tasks 30583 1726853667.96412: done queuing things up, now waiting for results queue to drain 30583 1726853667.96414: waiting for pending results... 
30583 1726853667.96964: running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml'
30583 1726853667.96970: in run() - task 02083763-bbaf-05ea-abc5-000000000032
30583 1726853667.96975: variable 'ansible_search_path' from source: unknown
30583 1726853667.96978: variable 'ansible_search_path' from source: unknown
30583 1726853667.96980: calling self._execute()
30583 1726853667.97118: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853667.97130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853667.97143: variable 'omit' from source: magic vars
30583 1726853667.98131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30583 1726853668.01269: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30583 1726853668.01366: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30583 1726853668.01415: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30583 1726853668.01455: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30583 1726853668.01564: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30583 1726853668.01826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853668.01830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853668.01916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853668.02120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853668.02124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853668.02367: variable '__network_is_ostree' from source: set_fact
30583 1726853668.02394: Evaluated conditional (not __network_is_ostree | d(false)): True
30583 1726853668.02404: _execute() done
30583 1726853668.02410: dumping result to json
30583 1726853668.02417: done dumping result, returning
30583 1726853668.02677: done running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' [02083763-bbaf-05ea-abc5-000000000032]
30583 1726853668.02680: sending task result for task 02083763-bbaf-05ea-abc5-000000000032
30583 1726853668.02749: done sending task result for task 02083763-bbaf-05ea-abc5-000000000032
30583 1726853668.02752: WORKER PROCESS EXITING
30583 1726853668.02785: no more pending results, returning what we have
30583 1726853668.02790: in VariableManager get_vars()
30583 1726853668.02827: Calling all_inventory to load vars for managed_node2
30583 1726853668.02831: Calling groups_inventory to load vars for managed_node2
30583 1726853668.02835: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853668.02846: Calling all_plugins_play to load vars for managed_node2
30583 1726853668.02849: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853668.02852: Calling groups_plugins_play to load vars for managed_node2
30583 1726853668.03548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853668.03899: done with get_vars()
30583 1726853668.03908: variable 'ansible_search_path' from source: unknown
30583 1726853668.03909: variable 'ansible_search_path' from source: unknown
30583 1726853668.03949: we have included files to process
30583 1726853668.03950: generating all_blocks data
30583 1726853668.03952: done generating all_blocks data
30583 1726853668.03961: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
30583 1726853668.03962: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
30583 1726853668.03965: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
30583 1726853668.04710: done processing included file
30583 1726853668.04712: iterating over new_blocks loaded from include file
30583 1726853668.04719: in VariableManager get_vars()
30583 1726853668.04732: done with get_vars()
30583 1726853668.04733: filtering new block on tags
30583 1726853668.04757: done filtering new block on tags
30583 1726853668.04760: in VariableManager get_vars()
30583 1726853668.04773: done with get_vars()
30583 1726853668.04775: filtering new block on tags
30583 1726853668.04793: done filtering new block on tags
30583 1726853668.04795: done iterating over new_blocks loaded from include file
included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node2
30583 1726853668.04802: extending task lists for all hosts with included blocks
30583 1726853668.04916: done extending task lists
30583 1726853668.04918: done processing included files
30583 1726853668.04919: results queue empty
30583 1726853668.04919: checking for any_errors_fatal
30583 1726853668.04922: done checking for any_errors_fatal
30583 1726853668.04923: checking for max_fail_percentage
30583 1726853668.04924: done checking for max_fail_percentage
30583 1726853668.04925: checking to see if all hosts have failed and the running result is not ok
30583 1726853668.04926: done checking to see if all hosts have failed
30583 1726853668.04926: getting the remaining hosts for this loop
30583 1726853668.04927: done getting the remaining hosts for this loop
30583 1726853668.04930: getting the next task for host managed_node2
30583 1726853668.04938: done getting next task for host managed_node2
30583 1726853668.04941: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }}
30583 1726853668.04943: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853668.04946: getting variables
30583 1726853668.04946: in VariableManager get_vars()
30583 1726853668.04954: Calling all_inventory to load vars for managed_node2
30583 1726853668.04956: Calling groups_inventory to load vars for managed_node2
30583 1726853668.04958: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853668.04963: Calling all_plugins_play to load vars for managed_node2
30583 1726853668.04974: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853668.04978: Calling groups_plugins_play to load vars for managed_node2
30583 1726853668.05282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853668.05484: done with get_vars()
30583 1726853668.05492: done getting variables
30583 1726853668.05558: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)
30583 1726853668.05746: variable 'ansible_distribution_major_version' from source: facts

TASK [Create EPEL 10] **********************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8
Friday 20 September 2024 13:34:28 -0400 (0:00:00.101) 0:00:03.395 ******
30583 1726853668.05800: entering _queue_task() for managed_node2/command
30583 1726853668.05802: Creating lock for command
30583 1726853668.06205: worker is 1 (out of 1 available)
30583 1726853668.06216: exiting _queue_task() for managed_node2/command
30583 1726853668.06226: done queuing things up, now waiting for results queue to drain
30583 1726853668.06227: waiting for pending results...
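For context on the "Create EPEL {{ ansible_distribution_major_version }}" task at enable_epel.yml:8: the log shows only that it is a `command` action gated on the two conditionals evaluated during the run. A hedged sketch under those assumptions follows; the actual command body is not in the log, so a placeholder is used:

```yaml
# Illustrative sketch of enable_epel.yml:8. Only the task name, the command
# action plugin, and the two when-conditions appear in the log; the command
# itself is a hypothetical placeholder.
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: "{{ __epel_setup_command }}"  # hypothetical variable, not from the log
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```

On this run both facts resolve against CentOS/RHEL 10, so the second condition is False and the task is skipped, exactly as the skipping result below reports.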
30583 1726853668.06535: running TaskExecutor() for managed_node2/TASK: Create EPEL 10
30583 1726853668.06550: in run() - task 02083763-bbaf-05ea-abc5-00000000004c
30583 1726853668.06574: variable 'ansible_search_path' from source: unknown
30583 1726853668.06581: variable 'ansible_search_path' from source: unknown
30583 1726853668.06619: calling self._execute()
30583 1726853668.06702: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853668.06722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853668.06741: variable 'omit' from source: magic vars
30583 1726853668.07152: variable 'ansible_distribution' from source: facts
30583 1726853668.07173: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
30583 1726853668.07311: variable 'ansible_distribution_major_version' from source: facts
30583 1726853668.07327: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
30583 1726853668.07334: when evaluation is False, skipping this task
30583 1726853668.07376: _execute() done
30583 1726853668.07379: dumping result to json
30583 1726853668.07382: done dumping result, returning
30583 1726853668.07385: done running TaskExecutor() for managed_node2/TASK: Create EPEL 10 [02083763-bbaf-05ea-abc5-00000000004c]
30583 1726853668.07387: sending task result for task 02083763-bbaf-05ea-abc5-00000000004c
30583 1726853668.07506: done sending task result for task 02083763-bbaf-05ea-abc5-00000000004c
30583 1726853668.07509: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
30583 1726853668.07710: no more pending results, returning what we have
30583 1726853668.07714: results queue empty
30583 1726853668.07715: checking for any_errors_fatal
30583 1726853668.07716: done checking for any_errors_fatal
30583 1726853668.07717: checking for max_fail_percentage
30583 1726853668.07719: done checking for max_fail_percentage
30583 1726853668.07719: checking to see if all hosts have failed and the running result is not ok
30583 1726853668.07720: done checking to see if all hosts have failed
30583 1726853668.07721: getting the remaining hosts for this loop
30583 1726853668.07723: done getting the remaining hosts for this loop
30583 1726853668.07726: getting the next task for host managed_node2
30583 1726853668.07733: done getting next task for host managed_node2
30583 1726853668.07736: ^ task is: TASK: Install yum-utils package
30583 1726853668.07740: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853668.07744: getting variables
30583 1726853668.07746: in VariableManager get_vars()
30583 1726853668.07778: Calling all_inventory to load vars for managed_node2
30583 1726853668.07781: Calling groups_inventory to load vars for managed_node2
30583 1726853668.07785: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853668.07798: Calling all_plugins_play to load vars for managed_node2
30583 1726853668.07802: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853668.07805: Calling groups_plugins_play to load vars for managed_node2
30583 1726853668.08185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853668.08401: done with get_vars()
30583 1726853668.08411: done getting variables
30583 1726853668.08516: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Install yum-utils package] ***********************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26
Friday 20 September 2024 13:34:28 -0400 (0:00:00.027) 0:00:03.422 ******
30583 1726853668.08548: entering _queue_task() for managed_node2/package
30583 1726853668.08550: Creating lock for package
30583 1726853668.08848: worker is 1 (out of 1 available)
30583 1726853668.08976: exiting _queue_task() for managed_node2/package
30583 1726853668.08987: done queuing things up, now waiting for results queue to drain
30583 1726853668.08989: waiting for pending results...
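The "Install yum-utils package" task at enable_epel.yml:26 is a `package` action (per the ActionModule load above) with the same pair of conditionals. A minimal sketch, assuming `state: present` (the package name comes from the task name; the state is an assumption, not shown in the log):

```yaml
# Sketch of enable_epel.yml:26 inferred from the log; state is assumed.
- name: Install yum-utils package
  package:
    name: yum-utils
    state: present  # assumption -- not visible in the log
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```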
30583 1726853668.09310: running TaskExecutor() for managed_node2/TASK: Install yum-utils package
30583 1726853668.09314: in run() - task 02083763-bbaf-05ea-abc5-00000000004d
30583 1726853668.09317: variable 'ansible_search_path' from source: unknown
30583 1726853668.09320: variable 'ansible_search_path' from source: unknown
30583 1726853668.09338: calling self._execute()
30583 1726853668.09494: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853668.09497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853668.09500: variable 'omit' from source: magic vars
30583 1726853668.09930: variable 'ansible_distribution' from source: facts
30583 1726853668.09959: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
30583 1726853668.10102: variable 'ansible_distribution_major_version' from source: facts
30583 1726853668.10115: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
30583 1726853668.10123: when evaluation is False, skipping this task
30583 1726853668.10131: _execute() done
30583 1726853668.10138: dumping result to json
30583 1726853668.10169: done dumping result, returning
30583 1726853668.10175: done running TaskExecutor() for managed_node2/TASK: Install yum-utils package [02083763-bbaf-05ea-abc5-00000000004d]
30583 1726853668.10177: sending task result for task 02083763-bbaf-05ea-abc5-00000000004d
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
30583 1726853668.10448: no more pending results, returning what we have
30583 1726853668.10451: results queue empty
30583 1726853668.10452: checking for any_errors_fatal
30583 1726853668.10461: done checking for any_errors_fatal
30583 1726853668.10462: checking for max_fail_percentage
30583 1726853668.10464: done checking for max_fail_percentage
30583 1726853668.10465: checking to see if all hosts have failed and the running result is not ok
30583 1726853668.10465: done checking to see if all hosts have failed
30583 1726853668.10466: getting the remaining hosts for this loop
30583 1726853668.10468: done getting the remaining hosts for this loop
30583 1726853668.10473: getting the next task for host managed_node2
30583 1726853668.10480: done getting next task for host managed_node2
30583 1726853668.10483: ^ task is: TASK: Enable EPEL 7
30583 1726853668.10494: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853668.10498: getting variables
30583 1726853668.10500: in VariableManager get_vars()
30583 1726853668.10531: Calling all_inventory to load vars for managed_node2
30583 1726853668.10534: Calling groups_inventory to load vars for managed_node2
30583 1726853668.10538: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853668.10608: done sending task result for task 02083763-bbaf-05ea-abc5-00000000004d
30583 1726853668.10612: WORKER PROCESS EXITING
30583 1726853668.10622: Calling all_plugins_play to load vars for managed_node2
30583 1726853668.10626: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853668.10629: Calling groups_plugins_play to load vars for managed_node2
30583 1726853668.10940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853668.11152: done with get_vars()
30583 1726853668.11196: done getting variables
30583 1726853668.11319: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 7] ***********************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32
Friday 20 September 2024 13:34:28 -0400 (0:00:00.027) 0:00:03.450 ******
30583 1726853668.11354: entering _queue_task() for managed_node2/command
30583 1726853668.11917: worker is 1 (out of 1 available)
30583 1726853668.11927: exiting _queue_task() for managed_node2/command
30583 1726853668.11939: done queuing things up, now waiting for results queue to drain
30583 1726853668.11940: waiting for pending results...
30583 1726853668.12347: running TaskExecutor() for managed_node2/TASK: Enable EPEL 7
30583 1726853668.12505: in run() - task 02083763-bbaf-05ea-abc5-00000000004e
30583 1726853668.12509: variable 'ansible_search_path' from source: unknown
30583 1726853668.12511: variable 'ansible_search_path' from source: unknown
30583 1726853668.12592: calling self._execute()
30583 1726853668.12660: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853668.12666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853668.12777: variable 'omit' from source: magic vars
30583 1726853668.13141: variable 'ansible_distribution' from source: facts
30583 1726853668.13160: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
30583 1726853668.13288: variable 'ansible_distribution_major_version' from source: facts
30583 1726853668.13302: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
30583 1726853668.13311: when evaluation is False, skipping this task
30583 1726853668.13319: _execute() done
30583 1726853668.13332: dumping result to json
30583 1726853668.13341: done dumping result, returning
30583 1726853668.13351: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 [02083763-bbaf-05ea-abc5-00000000004e]
30583 1726853668.13387: sending task result for task 02083763-bbaf-05ea-abc5-00000000004e
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
30583 1726853668.13572: no more pending results, returning what we have
30583 1726853668.13576: results queue empty
30583 1726853668.13577: checking for any_errors_fatal
30583 1726853668.13584: done checking for any_errors_fatal
30583 1726853668.13585: checking for max_fail_percentage
30583 1726853668.13587: done checking for max_fail_percentage
30583 1726853668.13588: checking to see if all hosts have failed and the running result is not ok
30583 1726853668.13588: done checking to see if all hosts have failed
30583 1726853668.13598: getting the remaining hosts for this loop
30583 1726853668.13600: done getting the remaining hosts for this loop
30583 1726853668.13604: getting the next task for host managed_node2
30583 1726853668.13612: done getting next task for host managed_node2
30583 1726853668.13615: ^ task is: TASK: Enable EPEL 8
30583 1726853668.13677: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853668.13687: getting variables
30583 1726853668.13694: in VariableManager get_vars()
30583 1726853668.13725: Calling all_inventory to load vars for managed_node2
30583 1726853668.13843: Calling groups_inventory to load vars for managed_node2
30583 1726853668.13849: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853668.13863: Calling all_plugins_play to load vars for managed_node2
30583 1726853668.13867: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853668.13870: Calling groups_plugins_play to load vars for managed_node2
30583 1726853668.14351: done sending task result for task 02083763-bbaf-05ea-abc5-00000000004e
30583 1726853668.14357: WORKER PROCESS EXITING
30583 1726853668.14373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853668.14839: done with get_vars()
30583 1726853668.14914: done getting variables
30583 1726853668.15019: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 8] ***********************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37
Friday 20 September 2024 13:34:28 -0400 (0:00:00.037) 0:00:03.487 ******
30583 1726853668.15051: entering _queue_task() for managed_node2/command
30583 1726853668.15368: worker is 1 (out of 1 available)
30583 1726853668.15384: exiting _queue_task() for managed_node2/command
30583 1726853668.15397: done queuing things up, now waiting for results queue to drain
30583 1726853668.15398: waiting for pending results...
30583 1726853668.15621: running TaskExecutor() for managed_node2/TASK: Enable EPEL 8
30583 1726853668.15735: in run() - task 02083763-bbaf-05ea-abc5-00000000004f
30583 1726853668.15753: variable 'ansible_search_path' from source: unknown
30583 1726853668.15765: variable 'ansible_search_path' from source: unknown
30583 1726853668.15810: calling self._execute()
30583 1726853668.15890: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853668.15907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853668.15921: variable 'omit' from source: magic vars
30583 1726853668.16314: variable 'ansible_distribution' from source: facts
30583 1726853668.16335: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
30583 1726853668.16472: variable 'ansible_distribution_major_version' from source: facts
30583 1726853668.16484: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
30583 1726853668.16492: when evaluation is False, skipping this task
30583 1726853668.16500: _execute() done
30583 1726853668.16506: dumping result to json
30583 1726853668.16514: done dumping result, returning
30583 1726853668.16525: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 [02083763-bbaf-05ea-abc5-00000000004f]
30583 1726853668.16534: sending task result for task 02083763-bbaf-05ea-abc5-00000000004f
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
30583 1726853668.16875: no more pending results, returning what we have
30583 1726853668.16878: results queue empty
30583 1726853668.16879: checking for any_errors_fatal
30583 1726853668.16883: done checking for any_errors_fatal
30583 1726853668.16884: checking for max_fail_percentage
30583 1726853668.16886: done checking for max_fail_percentage
30583 1726853668.16886: checking to see if all hosts have failed and the running result is not ok
30583 1726853668.16887: done checking to see if all hosts have failed
30583 1726853668.16888: getting the remaining hosts for this loop
30583 1726853668.16889: done getting the remaining hosts for this loop
30583 1726853668.16892: getting the next task for host managed_node2
30583 1726853668.16901: done getting next task for host managed_node2
30583 1726853668.16903: ^ task is: TASK: Enable EPEL 6
30583 1726853668.16907: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853668.16910: getting variables
30583 1726853668.16912: in VariableManager get_vars()
30583 1726853668.16937: Calling all_inventory to load vars for managed_node2
30583 1726853668.16939: Calling groups_inventory to load vars for managed_node2
30583 1726853668.16943: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853668.16952: Calling all_plugins_play to load vars for managed_node2
30583 1726853668.16958: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853668.16962: Calling groups_plugins_play to load vars for managed_node2
30583 1726853668.17378: done sending task result for task 02083763-bbaf-05ea-abc5-00000000004f
30583 1726853668.17381: WORKER PROCESS EXITING
30583 1726853668.17403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853668.17654: done with get_vars()
30583 1726853668.17668: done getting variables
30583 1726853668.17816: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 6] ***********************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42
Friday 20 September 2024 13:34:28 -0400 (0:00:00.027) 0:00:03.515 ******
30583 1726853668.17847: entering _queue_task() for managed_node2/copy
30583 1726853668.18469: worker is 1 (out of 1 available)
30583 1726853668.18482: exiting _queue_task() for managed_node2/copy
30583 1726853668.18501: done queuing things up, now waiting for results queue to drain
30583 1726853668.18502: waiting for pending results...
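Unlike the EPEL 7/8 tasks (both `command` actions), "Enable EPEL 6" at enable_epel.yml:42 loads the `copy` action plugin and is gated on `ansible_distribution_major_version == '6'`. A sketch under those logged facts; the copy payload (src/dest/content) never appears in the log, so every file path and variable below is a hypothetical placeholder:

```yaml
# Sketch of enable_epel.yml:42; action plugin and when-conditions are from the
# log, the copy parameters are placeholders.
- name: Enable EPEL 6
  copy:
    dest: /etc/yum.repos.d/epel.repo          # hypothetical destination
    content: "{{ __epel6_repo_definition }}"  # hypothetical variable
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version == '6'
```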
30583 1726853668.18636: running TaskExecutor() for managed_node2/TASK: Enable EPEL 6
30583 1726853668.18814: in run() - task 02083763-bbaf-05ea-abc5-000000000051
30583 1726853668.18844: variable 'ansible_search_path' from source: unknown
30583 1726853668.18852: variable 'ansible_search_path' from source: unknown
30583 1726853668.18897: calling self._execute()
30583 1726853668.18966: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853668.18969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853668.18996: variable 'omit' from source: magic vars
30583 1726853668.19321: variable 'ansible_distribution' from source: facts
30583 1726853668.19326: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
30583 1726853668.19403: variable 'ansible_distribution_major_version' from source: facts
30583 1726853668.19407: Evaluated conditional (ansible_distribution_major_version == '6'): False
30583 1726853668.19410: when evaluation is False, skipping this task
30583 1726853668.19415: _execute() done
30583 1726853668.19418: dumping result to json
30583 1726853668.19422: done dumping result, returning
30583 1726853668.19430: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 [02083763-bbaf-05ea-abc5-000000000051]
30583 1726853668.19435: sending task result for task 02083763-bbaf-05ea-abc5-000000000051
30583 1726853668.19519: done sending task result for task 02083763-bbaf-05ea-abc5-000000000051
30583 1726853668.19521: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '6'",
    "skip_reason": "Conditional result was False"
}
30583 1726853668.19579: no more pending results, returning what we have
30583 1726853668.19583: results queue empty
30583 1726853668.19583: checking for any_errors_fatal
30583 1726853668.19589: done checking for any_errors_fatal
30583 1726853668.19590: checking for max_fail_percentage
30583 1726853668.19592: done checking for max_fail_percentage
30583 1726853668.19593: checking to see if all hosts have failed and the running result is not ok
30583 1726853668.19594: done checking to see if all hosts have failed
30583 1726853668.19594: getting the remaining hosts for this loop
30583 1726853668.19596: done getting the remaining hosts for this loop
30583 1726853668.19600: getting the next task for host managed_node2
30583 1726853668.19608: done getting next task for host managed_node2
30583 1726853668.19610: ^ task is: TASK: Set network provider to 'nm'
30583 1726853668.19612: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853668.19616: getting variables
30583 1726853668.19618: in VariableManager get_vars()
30583 1726853668.19643: Calling all_inventory to load vars for managed_node2
30583 1726853668.19646: Calling groups_inventory to load vars for managed_node2
30583 1726853668.19648: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853668.19659: Calling all_plugins_play to load vars for managed_node2
30583 1726853668.19662: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853668.19664: Calling groups_plugins_play to load vars for managed_node2
30583 1726853668.19807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853668.19924: done with get_vars()
30583 1726853668.19934: done getting variables
30583 1726853668.19976: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:13 Friday 20 September 2024 13:34:28 -0400 (0:00:00.021) 0:00:03.537 ****** 30583 1726853668.19994: entering _queue_task() for managed_node2/set_fact 30583 1726853668.20224: worker is 1 (out of 1 available) 30583 1726853668.20235: exiting _queue_task() for managed_node2/set_fact 30583 1726853668.20247: done queuing things up, now waiting for results queue to drain 30583 1726853668.20248: waiting for pending results... 30583 1726853668.20695: running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' 30583 1726853668.20699: in run() - task 02083763-bbaf-05ea-abc5-000000000007 30583 1726853668.20701: variable 'ansible_search_path' from source: unknown 30583 1726853668.20704: calling self._execute() 30583 1726853668.20717: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853668.20728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853668.20741: variable 'omit' from source: magic vars 30583 1726853668.20852: variable 'omit' from source: magic vars 30583 1726853668.20890: variable 'omit' from source: magic vars 30583 1726853668.20939: variable 'omit' from source: magic vars 30583 1726853668.20987: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853668.21043: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853668.21069: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853668.21092: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853668.21122: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853668.21174: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853668.21177: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853668.21180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853668.21337: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853668.21339: Set connection var ansible_timeout to 10 30583 1726853668.21341: Set connection var ansible_connection to ssh 30583 1726853668.21342: Set connection var ansible_shell_executable to /bin/sh 30583 1726853668.21343: Set connection var ansible_shell_type to sh 30583 1726853668.21345: Set connection var ansible_pipelining to False 30583 1726853668.21346: variable 'ansible_shell_executable' from source: unknown 30583 1726853668.21347: variable 'ansible_connection' from source: unknown 30583 1726853668.21348: variable 'ansible_module_compression' from source: unknown 30583 1726853668.21349: variable 'ansible_shell_type' from source: unknown 30583 1726853668.21350: variable 'ansible_shell_executable' from source: unknown 30583 1726853668.21352: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853668.21353: variable 'ansible_pipelining' from source: unknown 30583 1726853668.21359: variable 'ansible_timeout' from source: unknown 30583 1726853668.21363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853668.21480: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853668.21491: variable 'omit' from source: magic vars 30583 1726853668.21498: starting 
attempt loop 30583 1726853668.21502: running the handler 30583 1726853668.21552: handler run complete 30583 1726853668.21554: attempt loop complete, returning result 30583 1726853668.21556: _execute() done 30583 1726853668.21558: dumping result to json 30583 1726853668.21559: done dumping result, returning 30583 1726853668.21560: done running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' [02083763-bbaf-05ea-abc5-000000000007] 30583 1726853668.21562: sending task result for task 02083763-bbaf-05ea-abc5-000000000007 ok: [managed_node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 30583 1726853668.21706: no more pending results, returning what we have 30583 1726853668.21709: results queue empty 30583 1726853668.21710: checking for any_errors_fatal 30583 1726853668.21715: done checking for any_errors_fatal 30583 1726853668.21716: checking for max_fail_percentage 30583 1726853668.21718: done checking for max_fail_percentage 30583 1726853668.21718: checking to see if all hosts have failed and the running result is not ok 30583 1726853668.21719: done checking to see if all hosts have failed 30583 1726853668.21719: getting the remaining hosts for this loop 30583 1726853668.21721: done getting the remaining hosts for this loop 30583 1726853668.21725: getting the next task for host managed_node2 30583 1726853668.21732: done getting next task for host managed_node2 30583 1726853668.21734: ^ task is: TASK: meta (flush_handlers) 30583 1726853668.21736: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853668.21740: getting variables 30583 1726853668.21741: in VariableManager get_vars() 30583 1726853668.21769: Calling all_inventory to load vars for managed_node2 30583 1726853668.21773: Calling groups_inventory to load vars for managed_node2 30583 1726853668.21776: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853668.21785: Calling all_plugins_play to load vars for managed_node2 30583 1726853668.21787: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853668.21790: Calling groups_plugins_play to load vars for managed_node2 30583 1726853668.21916: done sending task result for task 02083763-bbaf-05ea-abc5-000000000007 30583 1726853668.21919: WORKER PROCESS EXITING 30583 1726853668.21929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853668.22058: done with get_vars() 30583 1726853668.22065: done getting variables 30583 1726853668.22112: in VariableManager get_vars() 30583 1726853668.22118: Calling all_inventory to load vars for managed_node2 30583 1726853668.22120: Calling groups_inventory to load vars for managed_node2 30583 1726853668.22121: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853668.22124: Calling all_plugins_play to load vars for managed_node2 30583 1726853668.22125: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853668.22127: Calling groups_plugins_play to load vars for managed_node2 30583 1726853668.22213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853668.22320: done with get_vars() 30583 1726853668.22329: done queuing things up, now waiting for results queue to drain 30583 1726853668.22330: results queue empty 30583 1726853668.22331: checking for any_errors_fatal 30583 1726853668.22332: done checking for any_errors_fatal 30583 1726853668.22333: checking for max_fail_percentage 30583 
1726853668.22334: done checking for max_fail_percentage 30583 1726853668.22334: checking to see if all hosts have failed and the running result is not ok 30583 1726853668.22334: done checking to see if all hosts have failed 30583 1726853668.22335: getting the remaining hosts for this loop 30583 1726853668.22335: done getting the remaining hosts for this loop 30583 1726853668.22337: getting the next task for host managed_node2 30583 1726853668.22339: done getting next task for host managed_node2 30583 1726853668.22340: ^ task is: TASK: meta (flush_handlers) 30583 1726853668.22341: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853668.22347: getting variables 30583 1726853668.22347: in VariableManager get_vars() 30583 1726853668.22352: Calling all_inventory to load vars for managed_node2 30583 1726853668.22353: Calling groups_inventory to load vars for managed_node2 30583 1726853668.22357: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853668.22360: Calling all_plugins_play to load vars for managed_node2 30583 1726853668.22361: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853668.22363: Calling groups_plugins_play to load vars for managed_node2 30583 1726853668.22443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853668.22565: done with get_vars() 30583 1726853668.22570: done getting variables 30583 1726853668.22600: in VariableManager get_vars() 30583 1726853668.22605: Calling all_inventory to load vars for managed_node2 30583 1726853668.22606: Calling groups_inventory to load vars for managed_node2 30583 1726853668.22607: Calling all_plugins_inventory to load vars for managed_node2 30583 
1726853668.22610: Calling all_plugins_play to load vars for managed_node2 30583 1726853668.22611: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853668.22614: Calling groups_plugins_play to load vars for managed_node2 30583 1726853668.22692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853668.22795: done with get_vars() 30583 1726853668.22802: done queuing things up, now waiting for results queue to drain 30583 1726853668.22804: results queue empty 30583 1726853668.22804: checking for any_errors_fatal 30583 1726853668.22805: done checking for any_errors_fatal 30583 1726853668.22805: checking for max_fail_percentage 30583 1726853668.22806: done checking for max_fail_percentage 30583 1726853668.22806: checking to see if all hosts have failed and the running result is not ok 30583 1726853668.22807: done checking to see if all hosts have failed 30583 1726853668.22807: getting the remaining hosts for this loop 30583 1726853668.22808: done getting the remaining hosts for this loop 30583 1726853668.22809: getting the next task for host managed_node2 30583 1726853668.22811: done getting next task for host managed_node2 30583 1726853668.22811: ^ task is: None 30583 1726853668.22812: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853668.22813: done queuing things up, now waiting for results queue to drain 30583 1726853668.22813: results queue empty 30583 1726853668.22814: checking for any_errors_fatal 30583 1726853668.22814: done checking for any_errors_fatal 30583 1726853668.22815: checking for max_fail_percentage 30583 1726853668.22815: done checking for max_fail_percentage 30583 1726853668.22816: checking to see if all hosts have failed and the running result is not ok 30583 1726853668.22816: done checking to see if all hosts have failed 30583 1726853668.22817: getting the next task for host managed_node2 30583 1726853668.22818: done getting next task for host managed_node2 30583 1726853668.22819: ^ task is: None 30583 1726853668.22820: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853668.22855: in VariableManager get_vars() 30583 1726853668.22866: done with get_vars() 30583 1726853668.22870: in VariableManager get_vars() 30583 1726853668.22878: done with get_vars() 30583 1726853668.22881: variable 'omit' from source: magic vars 30583 1726853668.22900: in VariableManager get_vars() 30583 1726853668.22907: done with get_vars() 30583 1726853668.22920: variable 'omit' from source: magic vars PLAY [Play for testing states] ************************************************* 30583 1726853668.23131: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 30583 1726853668.23160: getting the remaining hosts for this loop 30583 1726853668.23161: done getting the remaining hosts for this loop 30583 1726853668.23163: getting the next task for host managed_node2 30583 1726853668.23166: done getting next task for host managed_node2 30583 1726853668.23168: ^ task is: TASK: Gathering Facts 30583 1726853668.23169: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853668.23174: getting variables 30583 1726853668.23175: in VariableManager get_vars() 30583 1726853668.23187: Calling all_inventory to load vars for managed_node2 30583 1726853668.23188: Calling groups_inventory to load vars for managed_node2 30583 1726853668.23190: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853668.23193: Calling all_plugins_play to load vars for managed_node2 30583 1726853668.23201: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853668.23203: Calling groups_plugins_play to load vars for managed_node2 30583 1726853668.23335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853668.23518: done with get_vars() 30583 1726853668.23526: done getting variables 30583 1726853668.23559: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:3 Friday 20 September 2024 13:34:28 -0400 (0:00:00.035) 0:00:03.573 ****** 30583 1726853668.23583: entering _queue_task() for managed_node2/gather_facts 30583 1726853668.23862: worker is 1 (out of 1 available) 30583 1726853668.23875: exiting _queue_task() for managed_node2/gather_facts 30583 1726853668.23886: done queuing things up, now waiting for results queue to drain 30583 1726853668.23888: waiting for pending results... 
30583 1726853668.24288: running TaskExecutor() for managed_node2/TASK: Gathering Facts 30583 1726853668.24292: in run() - task 02083763-bbaf-05ea-abc5-000000000077 30583 1726853668.24295: variable 'ansible_search_path' from source: unknown 30583 1726853668.24337: calling self._execute() 30583 1726853668.24415: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853668.24425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853668.24437: variable 'omit' from source: magic vars 30583 1726853668.24809: variable 'ansible_distribution_major_version' from source: facts 30583 1726853668.24829: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853668.24833: variable 'omit' from source: magic vars 30583 1726853668.24852: variable 'omit' from source: magic vars 30583 1726853668.24886: variable 'omit' from source: magic vars 30583 1726853668.24926: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853668.24952: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853668.24969: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853668.24983: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853668.24992: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853668.25014: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853668.25016: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853668.25019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853668.25092: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 
1726853668.25096: Set connection var ansible_timeout to 10 30583 1726853668.25098: Set connection var ansible_connection to ssh 30583 1726853668.25104: Set connection var ansible_shell_executable to /bin/sh 30583 1726853668.25106: Set connection var ansible_shell_type to sh 30583 1726853668.25113: Set connection var ansible_pipelining to False 30583 1726853668.25131: variable 'ansible_shell_executable' from source: unknown 30583 1726853668.25134: variable 'ansible_connection' from source: unknown 30583 1726853668.25136: variable 'ansible_module_compression' from source: unknown 30583 1726853668.25141: variable 'ansible_shell_type' from source: unknown 30583 1726853668.25143: variable 'ansible_shell_executable' from source: unknown 30583 1726853668.25146: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853668.25148: variable 'ansible_pipelining' from source: unknown 30583 1726853668.25150: variable 'ansible_timeout' from source: unknown 30583 1726853668.25152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853668.25284: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853668.25292: variable 'omit' from source: magic vars 30583 1726853668.25297: starting attempt loop 30583 1726853668.25300: running the handler 30583 1726853668.25312: variable 'ansible_facts' from source: unknown 30583 1726853668.25327: _low_level_execute_command(): starting 30583 1726853668.25333: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853668.25835: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853668.25840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853668.25844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853668.25891: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853668.25894: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853668.25989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30583 1726853668.28430: stdout chunk (state=3): >>>/root <<< 30583 1726853668.28585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853668.28617: stderr chunk (state=3): >>><<< 30583 1726853668.28619: stdout chunk (state=3): >>><<< 30583 1726853668.28649: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30583 1726853668.28653: _low_level_execute_command(): starting 30583 1726853668.28656: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853668.2863748-30754-237153369356055 `" && echo ansible-tmp-1726853668.2863748-30754-237153369356055="` echo /root/.ansible/tmp/ansible-tmp-1726853668.2863748-30754-237153369356055 `" ) && sleep 0' 30583 1726853668.29084: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853668.29088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853668.29090: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853668.29102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853668.29139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853668.29164: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853668.29396: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30583 1726853668.32251: stdout chunk (state=3): >>>ansible-tmp-1726853668.2863748-30754-237153369356055=/root/.ansible/tmp/ansible-tmp-1726853668.2863748-30754-237153369356055 <<< 30583 1726853668.32433: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853668.32460: stderr chunk (state=3): >>><<< 30583 1726853668.32463: stdout chunk (state=3): >>><<< 30583 1726853668.32521: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853668.2863748-30754-237153369356055=/root/.ansible/tmp/ansible-tmp-1726853668.2863748-30754-237153369356055 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30583 1726853668.32530: variable 'ansible_module_compression' from source: unknown 30583 1726853668.32576: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 30583 1726853668.32626: variable 'ansible_facts' from source: unknown 30583 1726853668.32753: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853668.2863748-30754-237153369356055/AnsiballZ_setup.py 30583 1726853668.32857: Sending initial data 30583 1726853668.32861: Sent initial data (154 bytes) 30583 1726853668.33276: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853668.33280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853668.33288: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853668.33304: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853668.33355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853668.33359: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853668.33364: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853668.33442: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30583 1726853668.35806: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853668.35887: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853668.35958: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpnd833rng /root/.ansible/tmp/ansible-tmp-1726853668.2863748-30754-237153369356055/AnsiballZ_setup.py <<< 30583 1726853668.35961: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853668.2863748-30754-237153369356055/AnsiballZ_setup.py" <<< 30583 1726853668.36030: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpnd833rng" to remote "/root/.ansible/tmp/ansible-tmp-1726853668.2863748-30754-237153369356055/AnsiballZ_setup.py" <<< 30583 1726853668.36036: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853668.2863748-30754-237153369356055/AnsiballZ_setup.py" <<< 30583 1726853668.37234: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853668.37280: stderr chunk (state=3): >>><<< 30583 1726853668.37283: stdout chunk (state=3): >>><<< 30583 1726853668.37304: done transferring module to remote 30583 1726853668.37314: _low_level_execute_command(): starting 30583 1726853668.37318: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853668.2863748-30754-237153369356055/ /root/.ansible/tmp/ansible-tmp-1726853668.2863748-30754-237153369356055/AnsiballZ_setup.py && sleep 0' 30583 1726853668.37767: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853668.37770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853668.37775: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853668.37777: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853668.37779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853668.37825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853668.37838: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853668.37906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30583 1726853668.40518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853668.40548: stderr chunk (state=3): >>><<< 30583 1726853668.40551: stdout chunk (state=3): >>><<< 30583 1726853668.40565: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30583 1726853668.40568: _low_level_execute_command(): starting 30583 1726853668.40573: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853668.2863748-30754-237153369356055/AnsiballZ_setup.py && sleep 0' 30583 1726853668.41022: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853668.41025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853668.41028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853668.41030: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853668.41033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853668.41084: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853668.41101: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853668.41177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30583 1726853669.25765: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-197.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-197", "ansible_nodename": "ip-10-31-9-197.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2134955d8b5184190900489dab957f", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2944, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 587, "free": 2944}, "nocache": {"free": 3284, "used": 247}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": 
"NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec213495-5d8b-5184-1909-00489dab957f", "ansible_product_uuid": "ec213495-5d8b-5184-1909-00489dab957f", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 879, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789663232, "block_size": 4096, "block_total": 65519099, "block_available": 63913492, "block_used": 1605607, "inode_total": 131070960, "inode_available": 131029060, "inode_used": 41900, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDiy4Yen7eiWP0/hmH4/5WHzI91c8NPRAJCku4Kk63/nAM2/HDHVpCGbs8kPnAcpJ95BGnY2AZ50i/GjByh6rqN4q0QNajZqOQdMrkomTRQGFsaoQTUzu+Wt7NYtajPseEV2zJTYbIlIC8H5nwTib7SkZscdc1iTw0saFFpV/aB+l5BDLfOe5EeE772aMDPUwKIw9RVy45e9Dl7uEv/Ez5XL/ZsZ8K0iZ4v2/Ebj39j+tw5M9hEjzRp4dqgv4FTXaFf2TvCql8dulUOPsjMu2MIvIfB4FbPNXrGKPKbzkjxWn4r+wUuvMPr4zoIJieVXFTR6ozZdzis6d3WFGAgZgX3ns+ULgR+lp0ZvHZb2amOGE8aM1TdwnDCeanweLvXk4zxXrpg0T4bTmQwKkDtd0DFml2CkWe4615TK07c49NoApmnEgPdztwxtraghMO72UOZkRBgUDB5GKSc202pCChA/GqiwfaUPdjS4LyUdkhgYAUniLPI2FRsZg4+EpoMZgs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa"<<< 30583 1726853669.25811: stdout chunk (state=3): >>>, "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAviMKS0iYCdMhDNjaRFlzVurOd6RVFe0VKYVOOZJko3KaULgIYAaS/l/1rRBz1963986hrDhKrLwmMRxr85S4Q=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAGtlq4ktcSkdXJkETJjSEIO/6xbcTDcVVefyj1D7mpG", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": 
"", "SSH_CONNECTION": "10.31.46.199 60520 10.31.9.197 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 60520 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.736328125, "5m": 0.63525390625, "15m": 0.36962890625}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "34", "second": "29", "epoch": "1726853669", "epoch_int": "1726853669", "date": "2024-09-20", "time": "13:34:29", "iso8601_micro": "2024-09-20T17:34:29.190560Z", "iso8601": "2024-09-20T17:34:29Z", "iso8601_basic": "20240920T133429190560", "iso8601_basic_short": "20240920T133429", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, 
"ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:bc:da:29:a4:45", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.197", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10bc:daff:fe29:a445", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": 
"off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", 
"tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_g<<< 30583 1726853669.25840: stdout chunk (state=3): >>>so_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, 
"ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.197", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:bc:da:29:a4:45", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.197"], "ansible_all_ipv6_addresses": ["fe80::10bc:daff:fe29:a445"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.197", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10bc:daff:fe29:a445"]}, "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 30583 1726853669.29085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853669.29089: stdout chunk (state=3): >>><<< 30583 1726853669.29092: stderr chunk (state=3): >>><<< 30583 1726853669.29278: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-197.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-197", "ansible_nodename": "ip-10-31-9-197.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2134955d8b5184190900489dab957f", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], 
"ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2944, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 587, "free": 2944}, "nocache": {"free": 3284, "used": 247}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec213495-5d8b-5184-1909-00489dab957f", "ansible_product_uuid": "ec213495-5d8b-5184-1909-00489dab957f", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": 
["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 879, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789663232, "block_size": 4096, "block_total": 65519099, "block_available": 63913492, "block_used": 1605607, "inode_total": 131070960, "inode_available": 131029060, "inode_used": 41900, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDiy4Yen7eiWP0/hmH4/5WHzI91c8NPRAJCku4Kk63/nAM2/HDHVpCGbs8kPnAcpJ95BGnY2AZ50i/GjByh6rqN4q0QNajZqOQdMrkomTRQGFsaoQTUzu+Wt7NYtajPseEV2zJTYbIlIC8H5nwTib7SkZscdc1iTw0saFFpV/aB+l5BDLfOe5EeE772aMDPUwKIw9RVy45e9Dl7uEv/Ez5XL/ZsZ8K0iZ4v2/Ebj39j+tw5M9hEjzRp4dqgv4FTXaFf2TvCql8dulUOPsjMu2MIvIfB4FbPNXrGKPKbzkjxWn4r+wUuvMPr4zoIJieVXFTR6ozZdzis6d3WFGAgZgX3ns+ULgR+lp0ZvHZb2amOGE8aM1TdwnDCeanweLvXk4zxXrpg0T4bTmQwKkDtd0DFml2CkWe4615TK07c49NoApmnEgPdztwxtraghMO72UOZkRBgUDB5GKSc202pCChA/GqiwfaUPdjS4LyUdkhgYAUniLPI2FRsZg4+EpoMZgs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAviMKS0iYCdMhDNjaRFlzVurOd6RVFe0VKYVOOZJko3KaULgIYAaS/l/1rRBz1963986hrDhKrLwmMRxr85S4Q=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAGtlq4ktcSkdXJkETJjSEIO/6xbcTDcVVefyj1D7mpG", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 
0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 60520 10.31.9.197 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 60520 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.736328125, "5m": 0.63525390625, "15m": 0.36962890625}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "34", "second": 
"29", "epoch": "1726853669", "epoch_int": "1726853669", "date": "2024-09-20", "time": "13:34:29", "iso8601_micro": "2024-09-20T17:34:29.190560Z", "iso8601": "2024-09-20T17:34:29Z", "iso8601_basic": "20240920T133429190560", "iso8601_basic_short": "20240920T133429", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:bc:da:29:a4:45", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.197", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10bc:daff:fe29:a445", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", 
"tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, 
"type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off 
[fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.197", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:bc:da:29:a4:45", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.197"], "ansible_all_ipv6_addresses": ["fe80::10bc:daff:fe29:a445"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.197", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10bc:daff:fe29:a445"]}, "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853669.29536: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853668.2863748-30754-237153369356055/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853669.29566: _low_level_execute_command(): starting 30583 1726853669.29580: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853668.2863748-30754-237153369356055/ > /dev/null 2>&1 && sleep 0' 30583 1726853669.30282: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853669.30299: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853669.30394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 
10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853669.30439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853669.30591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30583 1726853669.33341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853669.33577: stdout chunk (state=3): >>><<< 30583 1726853669.33581: stderr chunk (state=3): >>><<< 30583 1726853669.33584: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30583 1726853669.33586: handler run complete 30583 1726853669.33740: variable 'ansible_facts' from source: unknown 30583 1726853669.34005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853669.34740: variable 'ansible_facts' from source: unknown 30583 1726853669.34886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853669.35043: attempt loop complete, returning result 30583 1726853669.35054: _execute() done 30583 1726853669.35066: dumping result to json 30583 1726853669.35117: done dumping result, returning 30583 1726853669.35132: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [02083763-bbaf-05ea-abc5-000000000077] 30583 1726853669.35143: sending task result for task 02083763-bbaf-05ea-abc5-000000000077 ok: [managed_node2] 30583 1726853669.36084: no more pending results, returning what we have 30583 1726853669.36087: results queue empty 30583 1726853669.36088: checking for any_errors_fatal 30583 1726853669.36090: done checking for any_errors_fatal 30583 1726853669.36090: checking for max_fail_percentage 30583 1726853669.36092: done checking for max_fail_percentage 30583 1726853669.36093: checking to see if all hosts have failed and the running result is not ok 30583 1726853669.36093: done checking to see if all hosts have failed 30583 1726853669.36094: getting the remaining hosts for this loop 30583 1726853669.36096: done getting the remaining hosts for this loop 30583 1726853669.36110: getting the next task 
for host managed_node2 30583 1726853669.36116: done getting next task for host managed_node2 30583 1726853669.36118: ^ task is: TASK: meta (flush_handlers) 30583 1726853669.36120: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853669.36124: getting variables 30583 1726853669.36125: in VariableManager get_vars() 30583 1726853669.36148: Calling all_inventory to load vars for managed_node2 30583 1726853669.36151: Calling groups_inventory to load vars for managed_node2 30583 1726853669.36154: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853669.36221: done sending task result for task 02083763-bbaf-05ea-abc5-000000000077 30583 1726853669.36225: WORKER PROCESS EXITING 30583 1726853669.36236: Calling all_plugins_play to load vars for managed_node2 30583 1726853669.36239: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853669.36243: Calling groups_plugins_play to load vars for managed_node2 30583 1726853669.36448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853669.36644: done with get_vars() 30583 1726853669.36697: done getting variables 30583 1726853669.36782: in VariableManager get_vars() 30583 1726853669.36790: Calling all_inventory to load vars for managed_node2 30583 1726853669.36792: Calling groups_inventory to load vars for managed_node2 30583 1726853669.36794: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853669.36798: Calling all_plugins_play to load vars for managed_node2 30583 1726853669.36801: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853669.36803: Calling groups_plugins_play to load vars for managed_node2 30583 
1726853669.36926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853669.37101: done with get_vars() 30583 1726853669.37114: done queuing things up, now waiting for results queue to drain 30583 1726853669.37116: results queue empty 30583 1726853669.37117: checking for any_errors_fatal 30583 1726853669.37120: done checking for any_errors_fatal 30583 1726853669.37121: checking for max_fail_percentage 30583 1726853669.37122: done checking for max_fail_percentage 30583 1726853669.37123: checking to see if all hosts have failed and the running result is not ok 30583 1726853669.37124: done checking to see if all hosts have failed 30583 1726853669.37129: getting the remaining hosts for this loop 30583 1726853669.37130: done getting the remaining hosts for this loop 30583 1726853669.37133: getting the next task for host managed_node2 30583 1726853669.37137: done getting next task for host managed_node2 30583 1726853669.37139: ^ task is: TASK: Show playbook name 30583 1726853669.37140: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853669.37142: getting variables 30583 1726853669.37143: in VariableManager get_vars() 30583 1726853669.37151: Calling all_inventory to load vars for managed_node2 30583 1726853669.37153: Calling groups_inventory to load vars for managed_node2 30583 1726853669.37155: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853669.37159: Calling all_plugins_play to load vars for managed_node2 30583 1726853669.37161: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853669.37163: Calling groups_plugins_play to load vars for managed_node2 30583 1726853669.37293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853669.37487: done with get_vars() 30583 1726853669.37496: done getting variables 30583 1726853669.37575: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show playbook name] ****************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:11 Friday 20 September 2024 13:34:29 -0400 (0:00:01.140) 0:00:04.713 ****** 30583 1726853669.37600: entering _queue_task() for managed_node2/debug 30583 1726853669.37602: Creating lock for debug 30583 1726853669.37985: worker is 1 (out of 1 available) 30583 1726853669.37997: exiting _queue_task() for managed_node2/debug 30583 1726853669.38008: done queuing things up, now waiting for results queue to drain 30583 1726853669.38010: waiting for pending results... 
30583 1726853669.38166: running TaskExecutor() for managed_node2/TASK: Show playbook name 30583 1726853669.38259: in run() - task 02083763-bbaf-05ea-abc5-00000000000b 30583 1726853669.38268: variable 'ansible_search_path' from source: unknown 30583 1726853669.38301: calling self._execute() 30583 1726853669.38368: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.38373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.38382: variable 'omit' from source: magic vars 30583 1726853669.38642: variable 'ansible_distribution_major_version' from source: facts 30583 1726853669.38651: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853669.38660: variable 'omit' from source: magic vars 30583 1726853669.38680: variable 'omit' from source: magic vars 30583 1726853669.38704: variable 'omit' from source: magic vars 30583 1726853669.38738: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853669.38765: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853669.38784: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853669.38796: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853669.38805: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853669.38832: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853669.38836: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.38839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.38906: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 
1726853669.38911: Set connection var ansible_timeout to 10 30583 1726853669.38914: Set connection var ansible_connection to ssh 30583 1726853669.38919: Set connection var ansible_shell_executable to /bin/sh 30583 1726853669.38921: Set connection var ansible_shell_type to sh 30583 1726853669.38928: Set connection var ansible_pipelining to False 30583 1726853669.38950: variable 'ansible_shell_executable' from source: unknown 30583 1726853669.38953: variable 'ansible_connection' from source: unknown 30583 1726853669.38958: variable 'ansible_module_compression' from source: unknown 30583 1726853669.38961: variable 'ansible_shell_type' from source: unknown 30583 1726853669.38963: variable 'ansible_shell_executable' from source: unknown 30583 1726853669.38966: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.38968: variable 'ansible_pipelining' from source: unknown 30583 1726853669.38969: variable 'ansible_timeout' from source: unknown 30583 1726853669.38973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.39073: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853669.39081: variable 'omit' from source: magic vars 30583 1726853669.39085: starting attempt loop 30583 1726853669.39088: running the handler 30583 1726853669.39125: handler run complete 30583 1726853669.39143: attempt loop complete, returning result 30583 1726853669.39146: _execute() done 30583 1726853669.39150: dumping result to json 30583 1726853669.39153: done dumping result, returning 30583 1726853669.39165: done running TaskExecutor() for managed_node2/TASK: Show playbook name [02083763-bbaf-05ea-abc5-00000000000b] 30583 1726853669.39167: sending task result for task 
02083763-bbaf-05ea-abc5-00000000000b 30583 1726853669.39243: done sending task result for task 02083763-bbaf-05ea-abc5-00000000000b 30583 1726853669.39246: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: this is: playbooks/tests_states.yml 30583 1726853669.39297: no more pending results, returning what we have 30583 1726853669.39300: results queue empty 30583 1726853669.39301: checking for any_errors_fatal 30583 1726853669.39303: done checking for any_errors_fatal 30583 1726853669.39304: checking for max_fail_percentage 30583 1726853669.39306: done checking for max_fail_percentage 30583 1726853669.39306: checking to see if all hosts have failed and the running result is not ok 30583 1726853669.39307: done checking to see if all hosts have failed 30583 1726853669.39308: getting the remaining hosts for this loop 30583 1726853669.39310: done getting the remaining hosts for this loop 30583 1726853669.39313: getting the next task for host managed_node2 30583 1726853669.39321: done getting next task for host managed_node2 30583 1726853669.39324: ^ task is: TASK: Include the task 'run_test.yml' 30583 1726853669.39326: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853669.39329: getting variables 30583 1726853669.39330: in VariableManager get_vars() 30583 1726853669.39359: Calling all_inventory to load vars for managed_node2 30583 1726853669.39361: Calling groups_inventory to load vars for managed_node2 30583 1726853669.39364: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853669.39373: Calling all_plugins_play to load vars for managed_node2 30583 1726853669.39376: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853669.39378: Calling groups_plugins_play to load vars for managed_node2 30583 1726853669.39502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853669.39613: done with get_vars() 30583 1726853669.39620: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:22 Friday 20 September 2024 13:34:29 -0400 (0:00:00.020) 0:00:04.734 ****** 30583 1726853669.39681: entering _queue_task() for managed_node2/include_tasks 30583 1726853669.39881: worker is 1 (out of 1 available) 30583 1726853669.39893: exiting _queue_task() for managed_node2/include_tasks 30583 1726853669.39905: done queuing things up, now waiting for results queue to drain 30583 1726853669.39906: waiting for pending results... 
30583 1726853669.40286: running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' 30583 1726853669.40290: in run() - task 02083763-bbaf-05ea-abc5-00000000000d 30583 1726853669.40293: variable 'ansible_search_path' from source: unknown 30583 1726853669.40296: calling self._execute() 30583 1726853669.40348: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.40363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.40381: variable 'omit' from source: magic vars 30583 1726853669.40731: variable 'ansible_distribution_major_version' from source: facts 30583 1726853669.40741: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853669.40746: _execute() done 30583 1726853669.40749: dumping result to json 30583 1726853669.40779: done dumping result, returning 30583 1726853669.40782: done running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' [02083763-bbaf-05ea-abc5-00000000000d] 30583 1726853669.40785: sending task result for task 02083763-bbaf-05ea-abc5-00000000000d 30583 1726853669.40944: done sending task result for task 02083763-bbaf-05ea-abc5-00000000000d 30583 1726853669.40946: WORKER PROCESS EXITING 30583 1726853669.40983: no more pending results, returning what we have 30583 1726853669.40988: in VariableManager get_vars() 30583 1726853669.41019: Calling all_inventory to load vars for managed_node2 30583 1726853669.41022: Calling groups_inventory to load vars for managed_node2 30583 1726853669.41025: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853669.41036: Calling all_plugins_play to load vars for managed_node2 30583 1726853669.41038: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853669.41040: Calling groups_plugins_play to load vars for managed_node2 30583 1726853669.41261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 30583 1726853669.41461: done with get_vars() 30583 1726853669.41468: variable 'ansible_search_path' from source: unknown 30583 1726853669.41482: we have included files to process 30583 1726853669.41483: generating all_blocks data 30583 1726853669.41484: done generating all_blocks data 30583 1726853669.41485: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30583 1726853669.41486: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30583 1726853669.41488: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30583 1726853669.42085: in VariableManager get_vars() 30583 1726853669.42096: done with get_vars() 30583 1726853669.42125: in VariableManager get_vars() 30583 1726853669.42135: done with get_vars() 30583 1726853669.42159: in VariableManager get_vars() 30583 1726853669.42169: done with get_vars() 30583 1726853669.42198: in VariableManager get_vars() 30583 1726853669.42208: done with get_vars() 30583 1726853669.42231: in VariableManager get_vars() 30583 1726853669.42240: done with get_vars() 30583 1726853669.42465: in VariableManager get_vars() 30583 1726853669.42476: done with get_vars() 30583 1726853669.42484: done processing included file 30583 1726853669.42485: iterating over new_blocks loaded from include file 30583 1726853669.42486: in VariableManager get_vars() 30583 1726853669.42491: done with get_vars() 30583 1726853669.42492: filtering new block on tags 30583 1726853669.42689: done filtering new block on tags 30583 1726853669.42691: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node2 30583 1726853669.42694: extending task lists for all hosts with included 
blocks 30583 1726853669.42717: done extending task lists 30583 1726853669.42718: done processing included files 30583 1726853669.42718: results queue empty 30583 1726853669.42719: checking for any_errors_fatal 30583 1726853669.42722: done checking for any_errors_fatal 30583 1726853669.42722: checking for max_fail_percentage 30583 1726853669.42723: done checking for max_fail_percentage 30583 1726853669.42724: checking to see if all hosts have failed and the running result is not ok 30583 1726853669.42725: done checking to see if all hosts have failed 30583 1726853669.42725: getting the remaining hosts for this loop 30583 1726853669.42726: done getting the remaining hosts for this loop 30583 1726853669.42728: getting the next task for host managed_node2 30583 1726853669.42730: done getting next task for host managed_node2 30583 1726853669.42731: ^ task is: TASK: TEST: {{ lsr_description }} 30583 1726853669.42733: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853669.42734: getting variables 30583 1726853669.42735: in VariableManager get_vars() 30583 1726853669.42740: Calling all_inventory to load vars for managed_node2 30583 1726853669.42741: Calling groups_inventory to load vars for managed_node2 30583 1726853669.42742: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853669.42746: Calling all_plugins_play to load vars for managed_node2 30583 1726853669.42747: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853669.42749: Calling groups_plugins_play to load vars for managed_node2 30583 1726853669.42826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853669.42932: done with get_vars() 30583 1726853669.42940: done getting variables 30583 1726853669.42966: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853669.43044: variable 'lsr_description' from source: include params TASK [TEST: I can create a profile] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 13:34:29 -0400 (0:00:00.033) 0:00:04.767 ****** 30583 1726853669.43075: entering _queue_task() for managed_node2/debug 30583 1726853669.43295: worker is 1 (out of 1 available) 30583 1726853669.43309: exiting _queue_task() for managed_node2/debug 30583 1726853669.43320: done queuing things up, now waiting for results queue to drain 30583 1726853669.43321: waiting for pending results... 
30583 1726853669.43480: running TaskExecutor() for managed_node2/TASK: TEST: I can create a profile 30583 1726853669.43539: in run() - task 02083763-bbaf-05ea-abc5-000000000091 30583 1726853669.43551: variable 'ansible_search_path' from source: unknown 30583 1726853669.43561: variable 'ansible_search_path' from source: unknown 30583 1726853669.43591: calling self._execute() 30583 1726853669.43645: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.43651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.43665: variable 'omit' from source: magic vars 30583 1726853669.44014: variable 'ansible_distribution_major_version' from source: facts 30583 1726853669.44031: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853669.44034: variable 'omit' from source: magic vars 30583 1726853669.44176: variable 'omit' from source: magic vars 30583 1726853669.44180: variable 'lsr_description' from source: include params 30583 1726853669.44183: variable 'omit' from source: magic vars 30583 1726853669.44223: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853669.44263: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853669.44291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853669.44315: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853669.44332: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853669.44366: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853669.44376: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.44384: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.44486: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853669.44497: Set connection var ansible_timeout to 10 30583 1726853669.44503: Set connection var ansible_connection to ssh 30583 1726853669.44516: Set connection var ansible_shell_executable to /bin/sh 30583 1726853669.44523: Set connection var ansible_shell_type to sh 30583 1726853669.44537: Set connection var ansible_pipelining to False 30583 1726853669.44564: variable 'ansible_shell_executable' from source: unknown 30583 1726853669.44574: variable 'ansible_connection' from source: unknown 30583 1726853669.44622: variable 'ansible_module_compression' from source: unknown 30583 1726853669.44625: variable 'ansible_shell_type' from source: unknown 30583 1726853669.44628: variable 'ansible_shell_executable' from source: unknown 30583 1726853669.44630: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.44632: variable 'ansible_pipelining' from source: unknown 30583 1726853669.44634: variable 'ansible_timeout' from source: unknown 30583 1726853669.44636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.44764: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853669.44781: variable 'omit' from source: magic vars 30583 1726853669.44791: starting attempt loop 30583 1726853669.44799: running the handler 30583 1726853669.44890: handler run complete 30583 1726853669.44893: attempt loop complete, returning result 30583 1726853669.44896: _execute() done 30583 1726853669.44898: dumping result to json 30583 1726853669.44909: done dumping result, returning 30583 1726853669.44912: done 
running TaskExecutor() for managed_node2/TASK: TEST: I can create a profile [02083763-bbaf-05ea-abc5-000000000091] 30583 1726853669.44914: sending task result for task 02083763-bbaf-05ea-abc5-000000000091 30583 1726853669.45025: done sending task result for task 02083763-bbaf-05ea-abc5-000000000091 30583 1726853669.45029: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: ########## I can create a profile ########## 30583 1726853669.45078: no more pending results, returning what we have 30583 1726853669.45082: results queue empty 30583 1726853669.45083: checking for any_errors_fatal 30583 1726853669.45084: done checking for any_errors_fatal 30583 1726853669.45085: checking for max_fail_percentage 30583 1726853669.45086: done checking for max_fail_percentage 30583 1726853669.45087: checking to see if all hosts have failed and the running result is not ok 30583 1726853669.45088: done checking to see if all hosts have failed 30583 1726853669.45088: getting the remaining hosts for this loop 30583 1726853669.45091: done getting the remaining hosts for this loop 30583 1726853669.45095: getting the next task for host managed_node2 30583 1726853669.45101: done getting next task for host managed_node2 30583 1726853669.45103: ^ task is: TASK: Show item 30583 1726853669.45106: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853669.45109: getting variables 30583 1726853669.45110: in VariableManager get_vars() 30583 1726853669.45136: Calling all_inventory to load vars for managed_node2 30583 1726853669.45138: Calling groups_inventory to load vars for managed_node2 30583 1726853669.45141: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853669.45149: Calling all_plugins_play to load vars for managed_node2 30583 1726853669.45151: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853669.45154: Calling groups_plugins_play to load vars for managed_node2 30583 1726853669.45332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853669.45453: done with get_vars() 30583 1726853669.45460: done getting variables 30583 1726853669.45505: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 13:34:29 -0400 (0:00:00.024) 0:00:04.792 ****** 30583 1726853669.45524: entering _queue_task() for managed_node2/debug 30583 1726853669.45727: worker is 1 (out of 1 available) 30583 1726853669.45740: exiting _queue_task() for managed_node2/debug 30583 1726853669.45752: done queuing things up, now waiting for results queue to drain 30583 1726853669.45753: waiting for pending results... 
30583 1726853669.45905: running TaskExecutor() for managed_node2/TASK: Show item 30583 1726853669.45966: in run() - task 02083763-bbaf-05ea-abc5-000000000092 30583 1726853669.45989: variable 'ansible_search_path' from source: unknown 30583 1726853669.45993: variable 'ansible_search_path' from source: unknown 30583 1726853669.46024: variable 'omit' from source: magic vars 30583 1726853669.46121: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.46128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.46136: variable 'omit' from source: magic vars 30583 1726853669.46389: variable 'ansible_distribution_major_version' from source: facts 30583 1726853669.46398: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853669.46406: variable 'omit' from source: magic vars 30583 1726853669.46434: variable 'omit' from source: magic vars 30583 1726853669.46464: variable 'item' from source: unknown 30583 1726853669.46518: variable 'item' from source: unknown 30583 1726853669.46535: variable 'omit' from source: magic vars 30583 1726853669.46567: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853669.46595: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853669.46609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853669.46622: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853669.46633: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853669.46661: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853669.46665: variable 'ansible_host' from source: host vars for 'managed_node2' 
30583 1726853669.46667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.46732: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853669.46737: Set connection var ansible_timeout to 10 30583 1726853669.46750: Set connection var ansible_connection to ssh 30583 1726853669.46753: Set connection var ansible_shell_executable to /bin/sh 30583 1726853669.46758: Set connection var ansible_shell_type to sh 30583 1726853669.46763: Set connection var ansible_pipelining to False 30583 1726853669.46781: variable 'ansible_shell_executable' from source: unknown 30583 1726853669.46784: variable 'ansible_connection' from source: unknown 30583 1726853669.46786: variable 'ansible_module_compression' from source: unknown 30583 1726853669.46788: variable 'ansible_shell_type' from source: unknown 30583 1726853669.46791: variable 'ansible_shell_executable' from source: unknown 30583 1726853669.46793: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.46797: variable 'ansible_pipelining' from source: unknown 30583 1726853669.46800: variable 'ansible_timeout' from source: unknown 30583 1726853669.46803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.46905: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853669.46913: variable 'omit' from source: magic vars 30583 1726853669.46918: starting attempt loop 30583 1726853669.46921: running the handler 30583 1726853669.46959: variable 'lsr_description' from source: include params 30583 1726853669.47003: variable 'lsr_description' from source: include params 30583 1726853669.47010: handler run complete 30583 1726853669.47024: attempt loop 
complete, returning result 30583 1726853669.47036: variable 'item' from source: unknown 30583 1726853669.47083: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can create a profile" } 30583 1726853669.47218: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.47221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.47223: variable 'omit' from source: magic vars 30583 1726853669.47293: variable 'ansible_distribution_major_version' from source: facts 30583 1726853669.47296: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853669.47301: variable 'omit' from source: magic vars 30583 1726853669.47312: variable 'omit' from source: magic vars 30583 1726853669.47341: variable 'item' from source: unknown 30583 1726853669.47473: variable 'item' from source: unknown 30583 1726853669.47477: variable 'omit' from source: magic vars 30583 1726853669.47479: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853669.47482: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853669.47484: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853669.47486: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853669.47489: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.47491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.47675: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853669.47679: Set connection var ansible_timeout to 10 30583 
1726853669.47681: Set connection var ansible_connection to ssh 30583 1726853669.47683: Set connection var ansible_shell_executable to /bin/sh 30583 1726853669.47685: Set connection var ansible_shell_type to sh 30583 1726853669.47687: Set connection var ansible_pipelining to False 30583 1726853669.47688: variable 'ansible_shell_executable' from source: unknown 30583 1726853669.47690: variable 'ansible_connection' from source: unknown 30583 1726853669.47692: variable 'ansible_module_compression' from source: unknown 30583 1726853669.47694: variable 'ansible_shell_type' from source: unknown 30583 1726853669.47696: variable 'ansible_shell_executable' from source: unknown 30583 1726853669.47698: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.47700: variable 'ansible_pipelining' from source: unknown 30583 1726853669.47701: variable 'ansible_timeout' from source: unknown 30583 1726853669.47703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.47705: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853669.47707: variable 'omit' from source: magic vars 30583 1726853669.47709: starting attempt loop 30583 1726853669.47711: running the handler 30583 1726853669.47720: variable 'lsr_setup' from source: include params 30583 1726853669.47784: variable 'lsr_setup' from source: include params 30583 1726853669.47829: handler run complete 30583 1726853669.47850: attempt loop complete, returning result 30583 1726853669.47868: variable 'item' from source: unknown 30583 1726853669.47932: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ 
"tasks/delete_interface.yml", "tasks/assert_device_absent.yml" ] } 30583 1726853669.48177: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.48181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.48183: variable 'omit' from source: magic vars 30583 1726853669.48248: variable 'ansible_distribution_major_version' from source: facts 30583 1726853669.48258: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853669.48266: variable 'omit' from source: magic vars 30583 1726853669.48286: variable 'omit' from source: magic vars 30583 1726853669.48325: variable 'item' from source: unknown 30583 1726853669.48387: variable 'item' from source: unknown 30583 1726853669.48406: variable 'omit' from source: magic vars 30583 1726853669.48427: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853669.48440: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853669.48450: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853669.48466: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853669.48477: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.48485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.48554: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853669.48566: Set connection var ansible_timeout to 10 30583 1726853669.48574: Set connection var ansible_connection to ssh 30583 1726853669.48583: Set connection var ansible_shell_executable to /bin/sh 30583 1726853669.48589: Set connection var ansible_shell_type to sh 30583 1726853669.48677: Set 
connection var ansible_pipelining to False 30583 1726853669.48680: variable 'ansible_shell_executable' from source: unknown 30583 1726853669.48682: variable 'ansible_connection' from source: unknown 30583 1726853669.48684: variable 'ansible_module_compression' from source: unknown 30583 1726853669.48686: variable 'ansible_shell_type' from source: unknown 30583 1726853669.48688: variable 'ansible_shell_executable' from source: unknown 30583 1726853669.48690: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.48692: variable 'ansible_pipelining' from source: unknown 30583 1726853669.48694: variable 'ansible_timeout' from source: unknown 30583 1726853669.48696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.48741: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853669.48755: variable 'omit' from source: magic vars 30583 1726853669.48769: starting attempt loop 30583 1726853669.48778: running the handler 30583 1726853669.48798: variable 'lsr_test' from source: include params 30583 1726853669.48843: variable 'lsr_test' from source: include params 30583 1726853669.48861: handler run complete 30583 1726853669.48867: attempt loop complete, returning result 30583 1726853669.48882: variable 'item' from source: unknown 30583 1726853669.48932: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/create_bridge_profile.yml" ] } 30583 1726853669.49019: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.49022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.49025: 
variable 'omit' from source: magic vars 30583 1726853669.49110: variable 'ansible_distribution_major_version' from source: facts 30583 1726853669.49114: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853669.49118: variable 'omit' from source: magic vars 30583 1726853669.49131: variable 'omit' from source: magic vars 30583 1726853669.49158: variable 'item' from source: unknown 30583 1726853669.49204: variable 'item' from source: unknown 30583 1726853669.49212: variable 'omit' from source: magic vars 30583 1726853669.49225: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853669.49233: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853669.49236: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853669.49246: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853669.49249: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.49251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.49300: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853669.49303: Set connection var ansible_timeout to 10 30583 1726853669.49307: Set connection var ansible_connection to ssh 30583 1726853669.49310: Set connection var ansible_shell_executable to /bin/sh 30583 1726853669.49312: Set connection var ansible_shell_type to sh 30583 1726853669.49322: Set connection var ansible_pipelining to False 30583 1726853669.49335: variable 'ansible_shell_executable' from source: unknown 30583 1726853669.49343: variable 'ansible_connection' from source: unknown 30583 1726853669.49347: variable 'ansible_module_compression' from source: unknown 
30583 1726853669.49349: variable 'ansible_shell_type' from source: unknown 30583 1726853669.49351: variable 'ansible_shell_executable' from source: unknown 30583 1726853669.49353: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.49358: variable 'ansible_pipelining' from source: unknown 30583 1726853669.49360: variable 'ansible_timeout' from source: unknown 30583 1726853669.49362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.49413: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853669.49419: variable 'omit' from source: magic vars 30583 1726853669.49423: starting attempt loop 30583 1726853669.49426: running the handler 30583 1726853669.49442: variable 'lsr_assert' from source: include params 30583 1726853669.49488: variable 'lsr_assert' from source: include params 30583 1726853669.49500: handler run complete 30583 1726853669.49510: attempt loop complete, returning result 30583 1726853669.49520: variable 'item' from source: unknown 30583 1726853669.49592: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_profile_present.yml" ] } 30583 1726853669.49667: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.49670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.49674: variable 'omit' from source: magic vars 30583 1726853669.49765: variable 'ansible_distribution_major_version' from source: facts 30583 1726853669.49768: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853669.49777: variable 'omit' from source: 
magic vars 30583 1726853669.49790: variable 'omit' from source: magic vars 30583 1726853669.49814: variable 'item' from source: unknown 30583 1726853669.49858: variable 'item' from source: unknown 30583 1726853669.49869: variable 'omit' from source: magic vars 30583 1726853669.49882: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853669.49889: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853669.49897: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853669.49909: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853669.49911: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.49913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.49952: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853669.49957: Set connection var ansible_timeout to 10 30583 1726853669.49960: Set connection var ansible_connection to ssh 30583 1726853669.49962: Set connection var ansible_shell_executable to /bin/sh 30583 1726853669.49965: Set connection var ansible_shell_type to sh 30583 1726853669.49974: Set connection var ansible_pipelining to False 30583 1726853669.49988: variable 'ansible_shell_executable' from source: unknown 30583 1726853669.49990: variable 'ansible_connection' from source: unknown 30583 1726853669.49993: variable 'ansible_module_compression' from source: unknown 30583 1726853669.49995: variable 'ansible_shell_type' from source: unknown 30583 1726853669.49997: variable 'ansible_shell_executable' from source: unknown 30583 1726853669.49999: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.50003: variable 
'ansible_pipelining' from source: unknown 30583 1726853669.50005: variable 'ansible_timeout' from source: unknown 30583 1726853669.50018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.50067: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853669.50074: variable 'omit' from source: magic vars 30583 1726853669.50079: starting attempt loop 30583 1726853669.50081: running the handler 30583 1726853669.50096: variable 'lsr_assert_when' from source: include params 30583 1726853669.50141: variable 'lsr_assert_when' from source: include params 30583 1726853669.50199: variable 'network_provider' from source: set_fact 30583 1726853669.50222: handler run complete 30583 1726853669.50238: attempt loop complete, returning result 30583 1726853669.50248: variable 'item' from source: unknown 30583 1726853669.50292: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": [ { "condition": true, "what": "tasks/assert_device_present.yml" } ] } 30583 1726853669.50373: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.50377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.50379: variable 'omit' from source: magic vars 30583 1726853669.50466: variable 'ansible_distribution_major_version' from source: facts 30583 1726853669.50469: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853669.50474: variable 'omit' from source: magic vars 30583 1726853669.50485: variable 'omit' from source: magic vars 30583 1726853669.50514: variable 'item' from source: unknown 30583 
1726853669.50557: variable 'item' from source: unknown 30583 1726853669.50566: variable 'omit' from source: magic vars 30583 1726853669.50580: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853669.50586: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853669.50592: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853669.50610: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853669.50612: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.50615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.50652: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853669.50658: Set connection var ansible_timeout to 10 30583 1726853669.50661: Set connection var ansible_connection to ssh 30583 1726853669.50663: Set connection var ansible_shell_executable to /bin/sh 30583 1726853669.50665: Set connection var ansible_shell_type to sh 30583 1726853669.50821: Set connection var ansible_pipelining to False 30583 1726853669.50824: variable 'ansible_shell_executable' from source: unknown 30583 1726853669.50827: variable 'ansible_connection' from source: unknown 30583 1726853669.50829: variable 'ansible_module_compression' from source: unknown 30583 1726853669.50831: variable 'ansible_shell_type' from source: unknown 30583 1726853669.50833: variable 'ansible_shell_executable' from source: unknown 30583 1726853669.50835: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.50837: variable 'ansible_pipelining' from source: unknown 30583 1726853669.50839: variable 'ansible_timeout' from source: unknown 30583 1726853669.50841: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.50843: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853669.50845: variable 'omit' from source: magic vars 30583 1726853669.50847: starting attempt loop 30583 1726853669.50849: running the handler 30583 1726853669.50851: variable 'lsr_fail_debug' from source: play vars 30583 1726853669.50977: variable 'lsr_fail_debug' from source: play vars 30583 1726853669.50980: handler run complete 30583 1726853669.50982: attempt loop complete, returning result 30583 1726853669.50985: variable 'item' from source: unknown 30583 1726853669.50988: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 30583 1726853669.51177: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.51180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.51182: variable 'omit' from source: magic vars 30583 1726853669.51279: variable 'ansible_distribution_major_version' from source: facts 30583 1726853669.51287: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853669.51294: variable 'omit' from source: magic vars 30583 1726853669.51307: variable 'omit' from source: magic vars 30583 1726853669.51341: variable 'item' from source: unknown 30583 1726853669.51397: variable 'item' from source: unknown 30583 1726853669.51413: variable 'omit' from source: magic vars 30583 1726853669.51431: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, 
class_only=False) 30583 1726853669.51475: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853669.51478: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853669.51480: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853669.51482: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.51483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.51538: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853669.51548: Set connection var ansible_timeout to 10 30583 1726853669.51554: Set connection var ansible_connection to ssh 30583 1726853669.51576: Set connection var ansible_shell_executable to /bin/sh 30583 1726853669.51579: Set connection var ansible_shell_type to sh 30583 1726853669.51581: Set connection var ansible_pipelining to False 30583 1726853669.51676: variable 'ansible_shell_executable' from source: unknown 30583 1726853669.51679: variable 'ansible_connection' from source: unknown 30583 1726853669.51681: variable 'ansible_module_compression' from source: unknown 30583 1726853669.51683: variable 'ansible_shell_type' from source: unknown 30583 1726853669.51685: variable 'ansible_shell_executable' from source: unknown 30583 1726853669.51687: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.51688: variable 'ansible_pipelining' from source: unknown 30583 1726853669.51691: variable 'ansible_timeout' from source: unknown 30583 1726853669.51693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.51728: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853669.51739: variable 'omit' from source: magic vars 30583 1726853669.51747: starting attempt loop 30583 1726853669.51753: running the handler 30583 1726853669.51776: variable 'lsr_cleanup' from source: include params 30583 1726853669.51837: variable 'lsr_cleanup' from source: include params 30583 1726853669.51856: handler run complete 30583 1726853669.51875: attempt loop complete, returning result 30583 1726853669.51892: variable 'item' from source: unknown 30583 1726853669.51945: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 30583 1726853669.52176: dumping result to json 30583 1726853669.52179: done dumping result, returning 30583 1726853669.52182: done running TaskExecutor() for managed_node2/TASK: Show item [02083763-bbaf-05ea-abc5-000000000092] 30583 1726853669.52184: sending task result for task 02083763-bbaf-05ea-abc5-000000000092 30583 1726853669.52228: done sending task result for task 02083763-bbaf-05ea-abc5-000000000092 30583 1726853669.52231: WORKER PROCESS EXITING 30583 1726853669.52288: no more pending results, returning what we have 30583 1726853669.52292: results queue empty 30583 1726853669.52293: checking for any_errors_fatal 30583 1726853669.52299: done checking for any_errors_fatal 30583 1726853669.52299: checking for max_fail_percentage 30583 1726853669.52301: done checking for max_fail_percentage 30583 1726853669.52302: checking to see if all hosts have failed and the running result is not ok 30583 1726853669.52302: done checking to see if all hosts have failed 30583 1726853669.52303: getting the remaining hosts for this loop 30583 1726853669.52305: done getting the remaining hosts for this loop 30583 
1726853669.52308: getting the next task for host managed_node2 30583 1726853669.52317: done getting next task for host managed_node2 30583 1726853669.52320: ^ task is: TASK: Include the task 'show_interfaces.yml' 30583 1726853669.52322: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853669.52326: getting variables 30583 1726853669.52328: in VariableManager get_vars() 30583 1726853669.52352: Calling all_inventory to load vars for managed_node2 30583 1726853669.52357: Calling groups_inventory to load vars for managed_node2 30583 1726853669.52359: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853669.52368: Calling all_plugins_play to load vars for managed_node2 30583 1726853669.52370: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853669.52374: Calling groups_plugins_play to load vars for managed_node2 30583 1726853669.52661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853669.52860: done with get_vars() 30583 1726853669.52870: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 13:34:29 -0400 (0:00:00.074) 0:00:04.866 ****** 30583 1726853669.52953: entering _queue_task() for managed_node2/include_tasks 30583 
1726853669.53403: worker is 1 (out of 1 available) 30583 1726853669.53411: exiting _queue_task() for managed_node2/include_tasks 30583 1726853669.53420: done queuing things up, now waiting for results queue to drain 30583 1726853669.53422: waiting for pending results... 30583 1726853669.53488: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 30583 1726853669.53602: in run() - task 02083763-bbaf-05ea-abc5-000000000093 30583 1726853669.53646: variable 'ansible_search_path' from source: unknown 30583 1726853669.53650: variable 'ansible_search_path' from source: unknown 30583 1726853669.53672: calling self._execute() 30583 1726853669.53746: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.53877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.53880: variable 'omit' from source: magic vars 30583 1726853669.54118: variable 'ansible_distribution_major_version' from source: facts 30583 1726853669.54133: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853669.54144: _execute() done 30583 1726853669.54150: dumping result to json 30583 1726853669.54157: done dumping result, returning 30583 1726853669.54167: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [02083763-bbaf-05ea-abc5-000000000093] 30583 1726853669.54178: sending task result for task 02083763-bbaf-05ea-abc5-000000000093 30583 1726853669.54343: no more pending results, returning what we have 30583 1726853669.54349: in VariableManager get_vars() 30583 1726853669.54384: Calling all_inventory to load vars for managed_node2 30583 1726853669.54387: Calling groups_inventory to load vars for managed_node2 30583 1726853669.54391: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853669.54405: Calling all_plugins_play to load vars for managed_node2 30583 1726853669.54408: Calling groups_plugins_inventory to load 
vars for managed_node2 30583 1726853669.54411: Calling groups_plugins_play to load vars for managed_node2 30583 1726853669.54769: done sending task result for task 02083763-bbaf-05ea-abc5-000000000093 30583 1726853669.54774: WORKER PROCESS EXITING 30583 1726853669.54796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853669.54985: done with get_vars() 30583 1726853669.54992: variable 'ansible_search_path' from source: unknown 30583 1726853669.54994: variable 'ansible_search_path' from source: unknown 30583 1726853669.55033: we have included files to process 30583 1726853669.55034: generating all_blocks data 30583 1726853669.55035: done generating all_blocks data 30583 1726853669.55040: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30583 1726853669.55041: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30583 1726853669.55044: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30583 1726853669.55191: in VariableManager get_vars() 30583 1726853669.55208: done with get_vars() 30583 1726853669.55315: done processing included file 30583 1726853669.55317: iterating over new_blocks loaded from include file 30583 1726853669.55319: in VariableManager get_vars() 30583 1726853669.55331: done with get_vars() 30583 1726853669.55332: filtering new block on tags 30583 1726853669.55365: done filtering new block on tags 30583 1726853669.55367: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 30583 1726853669.55373: extending task lists for all hosts with included blocks 30583 1726853669.55833: 
done extending task lists 30583 1726853669.55834: done processing included files 30583 1726853669.55835: results queue empty 30583 1726853669.55836: checking for any_errors_fatal 30583 1726853669.55840: done checking for any_errors_fatal 30583 1726853669.55841: checking for max_fail_percentage 30583 1726853669.55842: done checking for max_fail_percentage 30583 1726853669.55843: checking to see if all hosts have failed and the running result is not ok 30583 1726853669.55843: done checking to see if all hosts have failed 30583 1726853669.55844: getting the remaining hosts for this loop 30583 1726853669.55845: done getting the remaining hosts for this loop 30583 1726853669.55847: getting the next task for host managed_node2 30583 1726853669.55851: done getting next task for host managed_node2 30583 1726853669.55853: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30583 1726853669.55855: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853669.55857: getting variables 30583 1726853669.55858: in VariableManager get_vars() 30583 1726853669.55866: Calling all_inventory to load vars for managed_node2 30583 1726853669.55867: Calling groups_inventory to load vars for managed_node2 30583 1726853669.55869: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853669.55875: Calling all_plugins_play to load vars for managed_node2 30583 1726853669.55877: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853669.55879: Calling groups_plugins_play to load vars for managed_node2 30583 1726853669.55993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853669.56182: done with get_vars() 30583 1726853669.56192: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 13:34:29 -0400 (0:00:00.033) 0:00:04.899 ****** 30583 1726853669.56268: entering _queue_task() for managed_node2/include_tasks 30583 1726853669.56570: worker is 1 (out of 1 available) 30583 1726853669.56686: exiting _queue_task() for managed_node2/include_tasks 30583 1726853669.56699: done queuing things up, now waiting for results queue to drain 30583 1726853669.56700: waiting for pending results... 
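The repeated `Evaluated conditional (ansible_distribution_major_version != '6'): True` lines in this trace are the per-task guard skipping EL6 hosts. A minimal shell equivalent of that check (the version value below is a hypothetical stand-in for illustration, not taken from this run's facts):

```shell
# Hypothetical fact value; the real value comes from gathered facts
# on managed_node2.
ansible_distribution_major_version=9
# Same test the trace reports as "Evaluated conditional (... != '6'): True"
[ "$ansible_distribution_major_version" != "6" ] && echo True
```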
30583 1726853669.56867: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 30583 1726853669.56989: in run() - task 02083763-bbaf-05ea-abc5-0000000000ba 30583 1726853669.57009: variable 'ansible_search_path' from source: unknown 30583 1726853669.57017: variable 'ansible_search_path' from source: unknown 30583 1726853669.57062: calling self._execute() 30583 1726853669.57142: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.57157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.57174: variable 'omit' from source: magic vars 30583 1726853669.57526: variable 'ansible_distribution_major_version' from source: facts 30583 1726853669.57542: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853669.57776: _execute() done 30583 1726853669.57780: dumping result to json 30583 1726853669.57783: done dumping result, returning 30583 1726853669.57786: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [02083763-bbaf-05ea-abc5-0000000000ba] 30583 1726853669.57788: sending task result for task 02083763-bbaf-05ea-abc5-0000000000ba 30583 1726853669.57855: done sending task result for task 02083763-bbaf-05ea-abc5-0000000000ba 30583 1726853669.57858: WORKER PROCESS EXITING 30583 1726853669.57880: no more pending results, returning what we have 30583 1726853669.57884: in VariableManager get_vars() 30583 1726853669.57908: Calling all_inventory to load vars for managed_node2 30583 1726853669.57910: Calling groups_inventory to load vars for managed_node2 30583 1726853669.57913: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853669.57921: Calling all_plugins_play to load vars for managed_node2 30583 1726853669.57923: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853669.57925: Calling groups_plugins_play to load vars for managed_node2 30583 
1726853669.58088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853669.58355: done with get_vars() 30583 1726853669.58362: variable 'ansible_search_path' from source: unknown 30583 1726853669.58363: variable 'ansible_search_path' from source: unknown 30583 1726853669.58397: we have included files to process 30583 1726853669.58399: generating all_blocks data 30583 1726853669.58400: done generating all_blocks data 30583 1726853669.58401: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30583 1726853669.58402: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30583 1726853669.58404: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30583 1726853669.58707: done processing included file 30583 1726853669.58709: iterating over new_blocks loaded from include file 30583 1726853669.58711: in VariableManager get_vars() 30583 1726853669.58726: done with get_vars() 30583 1726853669.58727: filtering new block on tags 30583 1726853669.58760: done filtering new block on tags 30583 1726853669.58763: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 30583 1726853669.58767: extending task lists for all hosts with included blocks 30583 1726853669.58917: done extending task lists 30583 1726853669.58918: done processing included files 30583 1726853669.58919: results queue empty 30583 1726853669.58919: checking for any_errors_fatal 30583 1726853669.58922: done checking for any_errors_fatal 30583 1726853669.58923: checking for max_fail_percentage 30583 1726853669.58924: done 
checking for max_fail_percentage 30583 1726853669.58925: checking to see if all hosts have failed and the running result is not ok 30583 1726853669.58925: done checking to see if all hosts have failed 30583 1726853669.58926: getting the remaining hosts for this loop 30583 1726853669.58927: done getting the remaining hosts for this loop 30583 1726853669.58929: getting the next task for host managed_node2 30583 1726853669.58934: done getting next task for host managed_node2 30583 1726853669.58936: ^ task is: TASK: Gather current interface info 30583 1726853669.58939: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853669.58941: getting variables 30583 1726853669.58942: in VariableManager get_vars() 30583 1726853669.58949: Calling all_inventory to load vars for managed_node2 30583 1726853669.58951: Calling groups_inventory to load vars for managed_node2 30583 1726853669.58954: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853669.58958: Calling all_plugins_play to load vars for managed_node2 30583 1726853669.58960: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853669.58963: Calling groups_plugins_play to load vars for managed_node2 30583 1726853669.59101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853669.59310: done with get_vars() 30583 1726853669.59318: done getting variables 30583 1726853669.59353: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 13:34:29 -0400 (0:00:00.031) 0:00:04.931 ****** 30583 1726853669.59383: entering _queue_task() for managed_node2/command 30583 1726853669.59631: worker is 1 (out of 1 available) 30583 1726853669.59644: exiting _queue_task() for managed_node2/command 30583 1726853669.59655: done queuing things up, now waiting for results queue to drain 30583 1726853669.59656: waiting for pending results... 
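The `_low_level_execute_command()` calls that follow show Ansible's remote execution handshake for this task: probe the home directory, create a private temp directory under `umask 77` (so it comes out mode 700), transfer and `chmod u+x` the AnsiballZ module, execute it, then delete the directory. A local sketch of that sequence, using stand-in paths rather than the exact `ansible-tmp-...` names from this run:

```shell
# 1) Discover the home directory (same probe as in the trace).
home=$(sh -c 'echo ~ && sleep 0')

# 2) Create a private temp dir; umask 77 yields mode 700 (drwx------).
tmp=$(sh -c '( umask 77 && mkdir -p "$HOME/.ansible/tmp" && mkdir "$HOME/.ansible/tmp/demo-tmp-$$" && echo "$HOME/.ansible/tmp/demo-tmp-$$" )')

# 3) Mark the transferred module executable
#    (stand-in file for AnsiballZ_command.py).
touch "$tmp/AnsiballZ_demo.py"
chmod u+x "$tmp/AnsiballZ_demo.py"

# 4) The module would run here, e.g. via the remote python interpreter.

# 5) Clean up, mirroring the trailing `rm -f -r ... > /dev/null 2>&1` call.
rm -f -r "$tmp" > /dev/null 2>&1
```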
30583 1726853669.59900: running TaskExecutor() for managed_node2/TASK: Gather current interface info 30583 1726853669.60013: in run() - task 02083763-bbaf-05ea-abc5-0000000000f5 30583 1726853669.60032: variable 'ansible_search_path' from source: unknown 30583 1726853669.60039: variable 'ansible_search_path' from source: unknown 30583 1726853669.60095: calling self._execute() 30583 1726853669.60151: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.60162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.60202: variable 'omit' from source: magic vars 30583 1726853669.60526: variable 'ansible_distribution_major_version' from source: facts 30583 1726853669.60544: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853669.60675: variable 'omit' from source: magic vars 30583 1726853669.60679: variable 'omit' from source: magic vars 30583 1726853669.60681: variable 'omit' from source: magic vars 30583 1726853669.60687: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853669.60724: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853669.60748: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853669.60768: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853669.60786: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853669.60821: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853669.60830: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.60837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 
1726853669.60940: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853669.60951: Set connection var ansible_timeout to 10 30583 1726853669.60958: Set connection var ansible_connection to ssh 30583 1726853669.60967: Set connection var ansible_shell_executable to /bin/sh 30583 1726853669.60975: Set connection var ansible_shell_type to sh 30583 1726853669.60988: Set connection var ansible_pipelining to False 30583 1726853669.61018: variable 'ansible_shell_executable' from source: unknown 30583 1726853669.61025: variable 'ansible_connection' from source: unknown 30583 1726853669.61033: variable 'ansible_module_compression' from source: unknown 30583 1726853669.61039: variable 'ansible_shell_type' from source: unknown 30583 1726853669.61045: variable 'ansible_shell_executable' from source: unknown 30583 1726853669.61051: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853669.61058: variable 'ansible_pipelining' from source: unknown 30583 1726853669.61064: variable 'ansible_timeout' from source: unknown 30583 1726853669.61119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853669.61212: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853669.61232: variable 'omit' from source: magic vars 30583 1726853669.61242: starting attempt loop 30583 1726853669.61248: running the handler 30583 1726853669.61266: _low_level_execute_command(): starting 30583 1726853669.61279: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853669.61981: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853669.61999: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 30583 1726853669.62021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853669.62129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853669.62180: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853669.62257: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30583 1726853669.64711: stdout chunk (state=3): >>>/root <<< 30583 1726853669.64884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853669.64918: stderr chunk (state=3): >>><<< 30583 1726853669.64933: stdout chunk (state=3): >>><<< 30583 1726853669.64962: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30583 1726853669.64983: _low_level_execute_command(): starting 30583 1726853669.64993: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853669.6496892-30811-231172804137838 `" && echo ansible-tmp-1726853669.6496892-30811-231172804137838="` echo /root/.ansible/tmp/ansible-tmp-1726853669.6496892-30811-231172804137838 `" ) && sleep 0' 30583 1726853669.65685: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853669.65793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853669.65831: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853669.65945: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30583 1726853669.68827: stdout chunk (state=3): >>>ansible-tmp-1726853669.6496892-30811-231172804137838=/root/.ansible/tmp/ansible-tmp-1726853669.6496892-30811-231172804137838 <<< 30583 1726853669.69038: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853669.69050: stdout chunk (state=3): >>><<< 30583 1726853669.69073: stderr chunk (state=3): >>><<< 30583 1726853669.69278: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853669.6496892-30811-231172804137838=/root/.ansible/tmp/ansible-tmp-1726853669.6496892-30811-231172804137838 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 
10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30583 1726853669.69282: variable 'ansible_module_compression' from source: unknown 30583 1726853669.69285: ANSIBALLZ: Using generic lock for ansible.legacy.command 30583 1726853669.69288: ANSIBALLZ: Acquiring lock 30583 1726853669.69290: ANSIBALLZ: Lock acquired: 139827455545936 30583 1726853669.69292: ANSIBALLZ: Creating module 30583 1726853669.84032: ANSIBALLZ: Writing module into payload 30583 1726853669.84098: ANSIBALLZ: Writing module 30583 1726853669.84112: ANSIBALLZ: Renaming module 30583 1726853669.84116: ANSIBALLZ: Done creating module 30583 1726853669.84131: variable 'ansible_facts' from source: unknown 30583 1726853669.84176: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853669.6496892-30811-231172804137838/AnsiballZ_command.py 30583 1726853669.84278: Sending initial data 30583 1726853669.84282: Sent initial data (156 bytes) 30583 1726853669.84720: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853669.84724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853669.84727: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853669.84729: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853669.84741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853669.84786: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853669.84808: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853669.84907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 1 <<< 30583 1726853669.87078: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853669.87170: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853669.87260: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmptodc7dws /root/.ansible/tmp/ansible-tmp-1726853669.6496892-30811-231172804137838/AnsiballZ_command.py <<< 30583 1726853669.87265: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853669.6496892-30811-231172804137838/AnsiballZ_command.py" <<< 30583 1726853669.87347: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmptodc7dws" to remote "/root/.ansible/tmp/ansible-tmp-1726853669.6496892-30811-231172804137838/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853669.6496892-30811-231172804137838/AnsiballZ_command.py" <<< 30583 1726853669.88454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853669.88457: stderr chunk (state=3): >>><<< 30583 1726853669.88460: stdout chunk (state=3): >>><<< 30583 1726853669.88463: done transferring module to remote 30583 1726853669.88466: _low_level_execute_command(): starting 30583 1726853669.88468: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853669.6496892-30811-231172804137838/ /root/.ansible/tmp/ansible-tmp-1726853669.6496892-30811-231172804137838/AnsiballZ_command.py && sleep 0' 30583 1726853669.89559: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853669.89808: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853669.89892: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853669.89967: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853669.92624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853669.92669: stderr chunk (state=3): >>><<< 30583 1726853669.92675: stdout chunk (state=3): >>><<< 30583 1726853669.92796: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853669.92800: _low_level_execute_command(): starting 30583 1726853669.92802: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853669.6496892-30811-231172804137838/AnsiballZ_command.py && sleep 0' 30583 1726853669.93813: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853669.93829: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853669.93858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853669.93921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853669.93969: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853669.94125: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853669.94247: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853669.94360: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 30583 1726853670.18732: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:34:30.181023", "end": "2024-09-20 13:34:30.186168", "delta": "0:00:00.005145", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30583 1726853670.21138: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853670.21164: stderr chunk (state=3): >>><<< 30583 1726853670.21168: stdout chunk (state=3): >>><<< 30583 1726853670.21189: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:34:30.181023", "end": "2024-09-20 13:34:30.186168", "delta": "0:00:00.005145", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853670.21216: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853669.6496892-30811-231172804137838/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853670.21223: _low_level_execute_command(): starting 30583 1726853670.21228: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853669.6496892-30811-231172804137838/ > /dev/null 2>&1 && sleep 0' 30583 1726853670.21647: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853670.21651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853670.21685: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853670.21689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853670.21691: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853670.21693: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853670.21695: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853670.21753: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853670.21756: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853670.21761: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853670.21839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853670.24577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853670.24581: stdout chunk (state=3): >>><<< 30583 1726853670.24584: stderr chunk (state=3): >>><<< 30583 1726853670.24586: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853670.24589: handler run complete 30583 1726853670.24591: Evaluated conditional (False): False 30583 1726853670.24601: attempt loop complete, returning result 30583 1726853670.24604: _execute() done 30583 1726853670.24606: dumping result to json 30583 1726853670.24611: done dumping result, returning 30583 1726853670.24622: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [02083763-bbaf-05ea-abc5-0000000000f5] 30583 1726853670.24625: sending task result for task 02083763-bbaf-05ea-abc5-0000000000f5 30583 1726853670.24739: done sending task result for task 02083763-bbaf-05ea-abc5-0000000000f5 30583 1726853670.24743: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.005145", "end": "2024-09-20 13:34:30.186168", "rc": 0, "start": "2024-09-20 13:34:30.181023" } STDOUT: bonding_masters eth0 lo 30583 1726853670.24826: no more pending results, returning what we have 30583 1726853670.24830: results queue empty 30583 1726853670.24831: checking for any_errors_fatal 30583 1726853670.24833: 
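The module invocation recorded in the result above (module `ansible.legacy.command` with `chdir=/sys/class/net` and `_raw_params=ls -1`) corresponds to a task roughly like the following. This is a sketch reconstructed from the logged `module_args`, not the actual playbook source; the register name `_current_interfaces` is inferred from the variable trace later in this log and may differ from the real file.

```yaml
# Reconstruction from the logged module_args; task and register names are
# inferred from this log, not copied from the playbook source.
- name: Gather current interface info
  command:
    cmd: ls -1          # logged as _raw_params: "ls -1"
    chdir: /sys/class/net  # logged as module_args.chdir
  register: _current_interfaces
```

The logged stdout (`bonding_masters\neth0\nlo`) is simply the directory listing of `/sys/class/net`, which is why the subsequent `set_fact` step can derive the interface list from it.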
done checking for any_errors_fatal 30583 1726853670.24834: checking for max_fail_percentage 30583 1726853670.24835: done checking for max_fail_percentage 30583 1726853670.24836: checking to see if all hosts have failed and the running result is not ok 30583 1726853670.24837: done checking to see if all hosts have failed 30583 1726853670.24838: getting the remaining hosts for this loop 30583 1726853670.24839: done getting the remaining hosts for this loop 30583 1726853670.24843: getting the next task for host managed_node2 30583 1726853670.24850: done getting next task for host managed_node2 30583 1726853670.24853: ^ task is: TASK: Set current_interfaces 30583 1726853670.24860: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853670.24864: getting variables 30583 1726853670.24866: in VariableManager get_vars() 30583 1726853670.24896: Calling all_inventory to load vars for managed_node2 30583 1726853670.24899: Calling groups_inventory to load vars for managed_node2 30583 1726853670.24903: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853670.24914: Calling all_plugins_play to load vars for managed_node2 30583 1726853670.24917: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853670.24920: Calling groups_plugins_play to load vars for managed_node2 30583 1726853670.25412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853670.25651: done with get_vars() 30583 1726853670.25663: done getting variables 30583 1726853670.25725: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:34:30 -0400 (0:00:00.663) 0:00:05.594 ****** 30583 1726853670.25760: entering _queue_task() for managed_node2/set_fact 30583 1726853670.26017: worker is 1 (out of 1 available) 30583 1726853670.26030: exiting _queue_task() for managed_node2/set_fact 30583 1726853670.26157: done queuing things up, now waiting for results queue to drain 30583 1726853670.26159: waiting for pending results... 
30583 1726853670.26493: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 30583 1726853670.26498: in run() - task 02083763-bbaf-05ea-abc5-0000000000f6 30583 1726853670.26502: variable 'ansible_search_path' from source: unknown 30583 1726853670.26504: variable 'ansible_search_path' from source: unknown 30583 1726853670.26507: calling self._execute() 30583 1726853670.26599: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853670.26610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853670.26623: variable 'omit' from source: magic vars 30583 1726853670.27040: variable 'ansible_distribution_major_version' from source: facts 30583 1726853670.27059: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853670.27124: variable 'omit' from source: magic vars 30583 1726853670.27131: variable 'omit' from source: magic vars 30583 1726853670.27263: variable '_current_interfaces' from source: set_fact 30583 1726853670.27412: variable 'omit' from source: magic vars 30583 1726853670.27470: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853670.27511: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853670.27535: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853670.27565: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853670.27589: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853670.27674: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853670.27681: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853670.27684: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853670.27760: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853670.27783: Set connection var ansible_timeout to 10 30583 1726853670.27796: Set connection var ansible_connection to ssh 30583 1726853670.27807: Set connection var ansible_shell_executable to /bin/sh 30583 1726853670.27813: Set connection var ansible_shell_type to sh 30583 1726853670.27827: Set connection var ansible_pipelining to False 30583 1726853670.27889: variable 'ansible_shell_executable' from source: unknown 30583 1726853670.27892: variable 'ansible_connection' from source: unknown 30583 1726853670.27898: variable 'ansible_module_compression' from source: unknown 30583 1726853670.27901: variable 'ansible_shell_type' from source: unknown 30583 1726853670.27903: variable 'ansible_shell_executable' from source: unknown 30583 1726853670.27905: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853670.27907: variable 'ansible_pipelining' from source: unknown 30583 1726853670.27908: variable 'ansible_timeout' from source: unknown 30583 1726853670.27912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853670.28108: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853670.28117: variable 'omit' from source: magic vars 30583 1726853670.28120: starting attempt loop 30583 1726853670.28122: running the handler 30583 1726853670.28124: handler run complete 30583 1726853670.28136: attempt loop complete, returning result 30583 1726853670.28143: _execute() done 30583 1726853670.28148: dumping result to json 30583 1726853670.28159: done dumping result, returning 30583 
1726853670.28174: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [02083763-bbaf-05ea-abc5-0000000000f6] 30583 1726853670.28215: sending task result for task 02083763-bbaf-05ea-abc5-0000000000f6 30583 1726853670.28281: done sending task result for task 02083763-bbaf-05ea-abc5-0000000000f6 30583 1726853670.28284: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 30583 1726853670.28385: no more pending results, returning what we have 30583 1726853670.28389: results queue empty 30583 1726853670.28390: checking for any_errors_fatal 30583 1726853670.28398: done checking for any_errors_fatal 30583 1726853670.28399: checking for max_fail_percentage 30583 1726853670.28401: done checking for max_fail_percentage 30583 1726853670.28402: checking to see if all hosts have failed and the running result is not ok 30583 1726853670.28403: done checking to see if all hosts have failed 30583 1726853670.28403: getting the remaining hosts for this loop 30583 1726853670.28405: done getting the remaining hosts for this loop 30583 1726853670.28410: getting the next task for host managed_node2 30583 1726853670.28419: done getting next task for host managed_node2 30583 1726853670.28421: ^ task is: TASK: Show current_interfaces 30583 1726853670.28427: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
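The `set_fact` result above (`current_interfaces: ["bonding_masters", "eth0", "lo"]`) implies a task along these lines. This is a reconstruction: the actual task lives at `tasks/get_current_interfaces.yml:9` per the log, and the exact expression (here assumed to be `stdout_lines` of the registered command result) is a guess consistent with the logged output.

```yaml
# Hedged sketch; the source variable name and the use of stdout_lines are
# assumptions based on the '_current_interfaces' variable seen in this log.
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"
```

With the logged command output, `stdout_lines` would evaluate to the three-element list shown in the task result.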
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853670.28433: getting variables 30583 1726853670.28435: in VariableManager get_vars() 30583 1726853670.28468: Calling all_inventory to load vars for managed_node2 30583 1726853670.28473: Calling groups_inventory to load vars for managed_node2 30583 1726853670.28478: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853670.28490: Calling all_plugins_play to load vars for managed_node2 30583 1726853670.28493: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853670.28496: Calling groups_plugins_play to load vars for managed_node2 30583 1726853670.28985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853670.29215: done with get_vars() 30583 1726853670.29232: done getting variables 30583 1726853670.29304: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 13:34:30 -0400 (0:00:00.035) 0:00:05.630 ****** 30583 1726853670.29332: entering _queue_task() for managed_node2/debug 30583 1726853670.29626: worker is 1 (out of 1 available) 30583 1726853670.29640: exiting _queue_task() for managed_node2/debug 30583 1726853670.29652: done queuing things up, now waiting for results queue to drain 30583 1726853670.29653: waiting for pending results... 
30583 1726853670.30003: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 30583 1726853670.30077: in run() - task 02083763-bbaf-05ea-abc5-0000000000bb 30583 1726853670.30080: variable 'ansible_search_path' from source: unknown 30583 1726853670.30083: variable 'ansible_search_path' from source: unknown 30583 1726853670.30121: calling self._execute() 30583 1726853670.30212: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853670.30224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853670.30276: variable 'omit' from source: magic vars 30583 1726853670.30619: variable 'ansible_distribution_major_version' from source: facts 30583 1726853670.30662: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853670.30676: variable 'omit' from source: magic vars 30583 1726853670.30723: variable 'omit' from source: magic vars 30583 1726853670.30868: variable 'current_interfaces' from source: set_fact 30583 1726853670.30873: variable 'omit' from source: magic vars 30583 1726853670.30915: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853670.30992: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853670.31020: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853670.31081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853670.31085: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853670.31095: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853670.31106: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853670.31114: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853670.31238: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853670.31250: Set connection var ansible_timeout to 10 30583 1726853670.31261: Set connection var ansible_connection to ssh 30583 1726853670.31273: Set connection var ansible_shell_executable to /bin/sh 30583 1726853670.31311: Set connection var ansible_shell_type to sh 30583 1726853670.31403: Set connection var ansible_pipelining to False 30583 1726853670.31410: variable 'ansible_shell_executable' from source: unknown 30583 1726853670.31413: variable 'ansible_connection' from source: unknown 30583 1726853670.31415: variable 'ansible_module_compression' from source: unknown 30583 1726853670.31417: variable 'ansible_shell_type' from source: unknown 30583 1726853670.31418: variable 'ansible_shell_executable' from source: unknown 30583 1726853670.31420: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853670.31421: variable 'ansible_pipelining' from source: unknown 30583 1726853670.31423: variable 'ansible_timeout' from source: unknown 30583 1726853670.31424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853670.31590: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853670.31605: variable 'omit' from source: magic vars 30583 1726853670.31618: starting attempt loop 30583 1726853670.31628: running the handler 30583 1726853670.31684: handler run complete 30583 1726853670.31703: attempt loop complete, returning result 30583 1726853670.31711: _execute() done 30583 1726853670.31718: dumping result to json 30583 1726853670.31735: done dumping result, returning 30583 1726853670.31777: done 
running TaskExecutor() for managed_node2/TASK: Show current_interfaces [02083763-bbaf-05ea-abc5-0000000000bb] 30583 1726853670.31841: sending task result for task 02083763-bbaf-05ea-abc5-0000000000bb 30583 1726853670.31927: done sending task result for task 02083763-bbaf-05ea-abc5-0000000000bb 30583 1726853670.31930: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 30583 1726853670.31991: no more pending results, returning what we have 30583 1726853670.31994: results queue empty 30583 1726853670.31995: checking for any_errors_fatal 30583 1726853670.32002: done checking for any_errors_fatal 30583 1726853670.32003: checking for max_fail_percentage 30583 1726853670.32005: done checking for max_fail_percentage 30583 1726853670.32006: checking to see if all hosts have failed and the running result is not ok 30583 1726853670.32006: done checking to see if all hosts have failed 30583 1726853670.32007: getting the remaining hosts for this loop 30583 1726853670.32009: done getting the remaining hosts for this loop 30583 1726853670.32013: getting the next task for host managed_node2 30583 1726853670.32022: done getting next task for host managed_node2 30583 1726853670.32025: ^ task is: TASK: Setup 30583 1726853670.32028: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
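The debug output above (`MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo']`) matches a task of roughly this shape at `tasks/show_interfaces.yml:5`; again a reconstruction from the logged message format, not the playbook source itself.

```yaml
# Sketch inferred from the logged MSG line; the exact msg template in the
# real show_interfaces.yml may differ.
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```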
False 30583 1726853670.32032: getting variables 30583 1726853670.32034: in VariableManager get_vars() 30583 1726853670.32273: Calling all_inventory to load vars for managed_node2 30583 1726853670.32276: Calling groups_inventory to load vars for managed_node2 30583 1726853670.32279: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853670.32288: Calling all_plugins_play to load vars for managed_node2 30583 1726853670.32291: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853670.32294: Calling groups_plugins_play to load vars for managed_node2 30583 1726853670.32557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853670.32763: done with get_vars() 30583 1726853670.32775: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 13:34:30 -0400 (0:00:00.035) 0:00:05.665 ****** 30583 1726853670.32874: entering _queue_task() for managed_node2/include_tasks 30583 1726853670.33315: worker is 1 (out of 1 available) 30583 1726853670.33328: exiting _queue_task() for managed_node2/include_tasks 30583 1726853670.33340: done queuing things up, now waiting for results queue to drain 30583 1726853670.33578: waiting for pending results... 
30583 1726853670.33781: running TaskExecutor() for managed_node2/TASK: Setup 30583 1726853670.33787: in run() - task 02083763-bbaf-05ea-abc5-000000000094 30583 1726853670.33796: variable 'ansible_search_path' from source: unknown 30583 1726853670.33799: variable 'ansible_search_path' from source: unknown 30583 1726853670.33810: variable 'lsr_setup' from source: include params 30583 1726853670.34056: variable 'lsr_setup' from source: include params 30583 1726853670.34109: variable 'omit' from source: magic vars 30583 1726853670.34200: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853670.34207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853670.34214: variable 'omit' from source: magic vars 30583 1726853670.34437: variable 'ansible_distribution_major_version' from source: facts 30583 1726853670.34444: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853670.34450: variable 'item' from source: unknown 30583 1726853670.34506: variable 'item' from source: unknown 30583 1726853670.34528: variable 'item' from source: unknown 30583 1726853670.34570: variable 'item' from source: unknown 30583 1726853670.34696: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853670.34699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853670.34702: variable 'omit' from source: magic vars 30583 1726853670.34775: variable 'ansible_distribution_major_version' from source: facts 30583 1726853670.34779: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853670.34785: variable 'item' from source: unknown 30583 1726853670.34829: variable 'item' from source: unknown 30583 1726853670.34848: variable 'item' from source: unknown 30583 1726853670.34894: variable 'item' from source: unknown 30583 1726853670.34957: dumping result to json 30583 1726853670.34960: done dumping result, returning 30583 
1726853670.34962: done running TaskExecutor() for managed_node2/TASK: Setup [02083763-bbaf-05ea-abc5-000000000094] 30583 1726853670.34963: sending task result for task 02083763-bbaf-05ea-abc5-000000000094 30583 1726853670.35001: done sending task result for task 02083763-bbaf-05ea-abc5-000000000094 30583 1726853670.35017: WORKER PROCESS EXITING 30583 1726853670.35042: no more pending results, returning what we have 30583 1726853670.35046: in VariableManager get_vars() 30583 1726853670.35087: Calling all_inventory to load vars for managed_node2 30583 1726853670.35090: Calling groups_inventory to load vars for managed_node2 30583 1726853670.35093: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853670.35105: Calling all_plugins_play to load vars for managed_node2 30583 1726853670.35108: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853670.35110: Calling groups_plugins_play to load vars for managed_node2 30583 1726853670.35375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853670.35566: done with get_vars() 30583 1726853670.35585: variable 'ansible_search_path' from source: unknown 30583 1726853670.35586: variable 'ansible_search_path' from source: unknown 30583 1726853670.35622: variable 'ansible_search_path' from source: unknown 30583 1726853670.35624: variable 'ansible_search_path' from source: unknown 30583 1726853670.35649: we have included files to process 30583 1726853670.35650: generating all_blocks data 30583 1726853670.35652: done generating all_blocks data 30583 1726853670.35658: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30583 1726853670.35659: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30583 1726853670.35661: Loading data from 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30583 1726853670.35883: done processing included file 30583 1726853670.35886: iterating over new_blocks loaded from include file 30583 1726853670.35887: in VariableManager get_vars() 30583 1726853670.35911: done with get_vars() 30583 1726853670.35913: filtering new block on tags 30583 1726853670.35937: done filtering new block on tags 30583 1726853670.35939: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node2 => (item=tasks/delete_interface.yml) 30583 1726853670.35943: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30583 1726853670.35944: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30583 1726853670.35947: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30583 1726853670.36140: in VariableManager get_vars() 30583 1726853670.36180: done with get_vars() 30583 1726853670.36327: done processing included file 30583 1726853670.36329: iterating over new_blocks loaded from include file 30583 1726853670.36331: in VariableManager get_vars() 30583 1726853670.36369: done with get_vars() 30583 1726853670.36373: filtering new block on tags 30583 1726853670.36405: done filtering new block on tags 30583 1726853670.36407: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 => (item=tasks/assert_device_absent.yml) 30583 1726853670.36409: extending task lists for all hosts with 
included blocks 30583 1726853670.37252: done extending task lists 30583 1726853670.37254: done processing included files 30583 1726853670.37254: results queue empty 30583 1726853670.37255: checking for any_errors_fatal 30583 1726853670.37258: done checking for any_errors_fatal 30583 1726853670.37258: checking for max_fail_percentage 30583 1726853670.37260: done checking for max_fail_percentage 30583 1726853670.37260: checking to see if all hosts have failed and the running result is not ok 30583 1726853670.37261: done checking to see if all hosts have failed 30583 1726853670.37262: getting the remaining hosts for this loop 30583 1726853670.37263: done getting the remaining hosts for this loop 30583 1726853670.37266: getting the next task for host managed_node2 30583 1726853670.37270: done getting next task for host managed_node2 30583 1726853670.37477: ^ task is: TASK: Remove test interface if necessary 30583 1726853670.37481: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853670.37484: getting variables 30583 1726853670.37485: in VariableManager get_vars() 30583 1726853670.37498: Calling all_inventory to load vars for managed_node2 30583 1726853670.37500: Calling groups_inventory to load vars for managed_node2 30583 1726853670.37502: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853670.37508: Calling all_plugins_play to load vars for managed_node2 30583 1726853670.37510: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853670.37512: Calling groups_plugins_play to load vars for managed_node2 30583 1726853670.37646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853670.37858: done with get_vars() 30583 1726853670.37867: done getting variables 30583 1726853670.37911: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 13:34:30 -0400 (0:00:00.050) 0:00:05.716 ****** 30583 1726853670.37953: entering _queue_task() for managed_node2/command 30583 1726853670.38247: worker is 1 (out of 1 available) 30583 1726853670.38259: exiting _queue_task() for managed_node2/command 30583 1726853670.38275: done queuing things up, now waiting for results queue to drain 30583 1726853670.38276: waiting for pending results... 
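Before the module runs, the connection plugin stages a per-task temp directory on the remote host; the log records this as a `_low_level_execute_command()` call wrapping `( umask 77 && mkdir -p ... )` and, after the task, a matching `rm -f -r ... && sleep 0` cleanup. A minimal local sketch of that pattern follows — the directory name here is illustrative; the real one embeds a timestamp and a random suffix as seen in the log.

```shell
#!/bin/sh
# Sketch of the remote tmpdir lifecycle seen in the _low_level_execute_command()
# calls in this log. "ansible-tmp-example" stands in for the real
# ansible-tmp-<timestamp>-<pid>-<random> name.
base="${TMPDIR:-/tmp}/ansible-sketch-$$"
tmp="$base/ansible-tmp-example"

# umask 77 makes every directory created in the subshell mode 0700
# (owner-only), mirroring the `( umask 77 && mkdir -p ... )` exchange above.
( umask 77 && mkdir -p "$base" && mkdir "$tmp" ) && echo "created=$tmp"

# Show the permissions so the 0700 mode is visible.
ls -ld "$tmp"

# Cleanup mirrors the final `rm -f -r .../ > /dev/null 2>&1 && sleep 0` call.
rm -rf "$base"
```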
30583 1726853670.38989: running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary 30583 1726853670.38995: in run() - task 02083763-bbaf-05ea-abc5-00000000011b 30583 1726853670.38998: variable 'ansible_search_path' from source: unknown 30583 1726853670.39001: variable 'ansible_search_path' from source: unknown 30583 1726853670.39004: calling self._execute() 30583 1726853670.39134: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853670.39206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853670.39221: variable 'omit' from source: magic vars 30583 1726853670.39822: variable 'ansible_distribution_major_version' from source: facts 30583 1726853670.39840: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853670.39855: variable 'omit' from source: magic vars 30583 1726853670.39958: variable 'omit' from source: magic vars 30583 1726853670.40087: variable 'interface' from source: play vars 30583 1726853670.40111: variable 'omit' from source: magic vars 30583 1726853670.40156: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853670.40204: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853670.40229: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853670.40254: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853670.40274: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853670.40307: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853670.40314: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853670.40319: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853670.40417: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853670.40429: Set connection var ansible_timeout to 10 30583 1726853670.40436: Set connection var ansible_connection to ssh 30583 1726853670.40445: Set connection var ansible_shell_executable to /bin/sh 30583 1726853670.40452: Set connection var ansible_shell_type to sh 30583 1726853670.40465: Set connection var ansible_pipelining to False 30583 1726853670.40497: variable 'ansible_shell_executable' from source: unknown 30583 1726853670.40508: variable 'ansible_connection' from source: unknown 30583 1726853670.40516: variable 'ansible_module_compression' from source: unknown 30583 1726853670.40576: variable 'ansible_shell_type' from source: unknown 30583 1726853670.40579: variable 'ansible_shell_executable' from source: unknown 30583 1726853670.40581: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853670.40583: variable 'ansible_pipelining' from source: unknown 30583 1726853670.40585: variable 'ansible_timeout' from source: unknown 30583 1726853670.40587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853670.40686: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853670.40702: variable 'omit' from source: magic vars 30583 1726853670.40710: starting attempt loop 30583 1726853670.40717: running the handler 30583 1726853670.40736: _low_level_execute_command(): starting 30583 1726853670.40746: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853670.41412: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 
1726853670.41493: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853670.41536: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853670.41559: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853670.41585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853670.41714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853670.43466: stdout chunk (state=3): >>>/root <<< 30583 1726853670.43564: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853670.43609: stderr chunk (state=3): >>><<< 30583 1726853670.43612: stdout chunk (state=3): >>><<< 30583 1726853670.43631: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 
10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853670.43662: _low_level_execute_command(): starting 30583 1726853670.43666: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853670.4363868-30849-173189785267679 `" && echo ansible-tmp-1726853670.4363868-30849-173189785267679="` echo /root/.ansible/tmp/ansible-tmp-1726853670.4363868-30849-173189785267679 `" ) && sleep 0' 30583 1726853670.44698: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853670.44722: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853670.44738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853670.44842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853670.46887: stdout chunk (state=3): >>>ansible-tmp-1726853670.4363868-30849-173189785267679=/root/.ansible/tmp/ansible-tmp-1726853670.4363868-30849-173189785267679 <<< 30583 1726853670.47038: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853670.47049: stdout chunk (state=3): >>><<< 30583 1726853670.47060: stderr chunk (state=3): >>><<< 30583 1726853670.47090: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853670.4363868-30849-173189785267679=/root/.ansible/tmp/ansible-tmp-1726853670.4363868-30849-173189785267679 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853670.47141: variable 'ansible_module_compression' from source: unknown 30583 1726853670.47200: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30583 1726853670.47250: variable 'ansible_facts' from source: unknown 30583 1726853670.47342: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853670.4363868-30849-173189785267679/AnsiballZ_command.py 30583 1726853670.47580: Sending initial data 30583 1726853670.47589: Sent initial data (156 bytes) 30583 1726853670.48143: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' <<< 30583 1726853670.48161: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853670.48176: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853670.48289: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853670.49980: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853670.50052: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853670.50148: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp2h99itvd /root/.ansible/tmp/ansible-tmp-1726853670.4363868-30849-173189785267679/AnsiballZ_command.py <<< 30583 1726853670.50151: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853670.4363868-30849-173189785267679/AnsiballZ_command.py" <<< 30583 1726853670.50242: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp2h99itvd" to remote "/root/.ansible/tmp/ansible-tmp-1726853670.4363868-30849-173189785267679/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853670.4363868-30849-173189785267679/AnsiballZ_command.py" <<< 30583 1726853670.51100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853670.51104: stdout chunk (state=3): >>><<< 30583 1726853670.51258: stderr chunk (state=3): >>><<< 30583 1726853670.51262: done transferring module to remote 30583 1726853670.51265: _low_level_execute_command(): starting 30583 1726853670.51268: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853670.4363868-30849-173189785267679/ /root/.ansible/tmp/ansible-tmp-1726853670.4363868-30849-173189785267679/AnsiballZ_command.py && sleep 0' 30583 1726853670.51682: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853670.51686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853670.51688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 
1726853670.51692: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853670.51694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853670.51696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853670.51742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853670.51747: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853670.51830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853670.53777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853670.53790: stderr chunk (state=3): >>><<< 30583 1726853670.53793: stdout chunk (state=3): >>><<< 30583 1726853670.53810: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853670.53813: _low_level_execute_command(): starting 30583 1726853670.53819: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853670.4363868-30849-173189785267679/AnsiballZ_command.py && sleep 0' 30583 1726853670.54354: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853670.54363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853670.54392: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853670.54395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853670.54398: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853670.54400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853670.54448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853670.54451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853670.54539: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853670.71201: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-20 13:34:30.703095", "end": "2024-09-20 13:34:30.710796", "delta": "0:00:00.007701", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30583 1726853670.72860: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853670.72864: stdout chunk (state=3): >>><<< 30583 1726853670.72867: stderr chunk (state=3): >>><<< 30583 1726853670.73078: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-20 13:34:30.703095", "end": "2024-09-20 13:34:30.710796", "delta": "0:00:00.007701", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.197 closed. 
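The JSON block above is the command module's own result, echoed on stdout and then folded into the task result. A simplified sketch of the decision that follows it — this is not Ansible's actual executor code, and the `ignore_errors: true` setting is an assumption inferred from the trailing `...ignoring` in the output:

```python
import json

# Module response captured in the log above, abbreviated to the relevant keys.
module_stdout = '''{"changed": true, "stdout": "",
"stderr": "Cannot find device \\"statebr\\"",
"rc": 1, "cmd": ["ip", "link", "del", "statebr"],
"failed": true, "msg": "non-zero return code"}'''

result = json.loads(module_stdout)

# Simplified failure determination: the command module marks the task failed
# on a non-zero return code, but ignore_errors on the task (assumed here,
# visible in the log as "...ignoring") lets the play continue anyway.
ignore_errors = True
task_failed = result.get("failed", False) or result.get("rc", 0) != 0
play_continues = (not task_failed) or ignore_errors

print(task_failed, play_continues)  # True True: the task failed, the run goes on
```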
30583 1726853670.73082: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853670.4363868-30849-173189785267679/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853670.73085: _low_level_execute_command(): starting 30583 1726853670.73088: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853670.4363868-30849-173189785267679/ > /dev/null 2>&1 && sleep 0' 30583 1726853670.73707: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853670.73717: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853670.73727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853670.73742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853670.73764: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853670.73773: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853670.73785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853670.73800: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853670.73808: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853670.73815: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853670.73823: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853670.73833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853670.73896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853670.73989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853670.73998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853670.74098: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853670.76018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853670.76051: stderr chunk (state=3): >>><<< 30583 1726853670.76054: stdout chunk (state=3): >>><<< 30583 1726853670.76068: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853670.76075: handler run complete 30583 1726853670.76092: Evaluated conditional (False): False 30583 1726853670.76100: attempt loop complete, returning result 30583 1726853670.76103: _execute() done 30583 1726853670.76105: dumping result to json 30583 1726853670.76110: done dumping result, returning 30583 1726853670.76122: done running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary [02083763-bbaf-05ea-abc5-00000000011b] 30583 1726853670.76126: sending task result for task 02083763-bbaf-05ea-abc5-00000000011b 30583 1726853670.76218: done sending task result for task 02083763-bbaf-05ea-abc5-00000000011b 30583 1726853670.76221: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! 
=> { "changed": false, "cmd": [ "ip", "link", "del", "statebr" ], "delta": "0:00:00.007701", "end": "2024-09-20 13:34:30.710796", "rc": 1, "start": "2024-09-20 13:34:30.703095" } STDERR: Cannot find device "statebr" MSG: non-zero return code ...ignoring 30583 1726853670.76300: no more pending results, returning what we have 30583 1726853670.76305: results queue empty 30583 1726853670.76306: checking for any_errors_fatal 30583 1726853670.76307: done checking for any_errors_fatal 30583 1726853670.76308: checking for max_fail_percentage 30583 1726853670.76310: done checking for max_fail_percentage 30583 1726853670.76311: checking to see if all hosts have failed and the running result is not ok 30583 1726853670.76312: done checking to see if all hosts have failed 30583 1726853670.76312: getting the remaining hosts for this loop 30583 1726853670.76314: done getting the remaining hosts for this loop 30583 1726853670.76318: getting the next task for host managed_node2 30583 1726853670.76327: done getting next task for host managed_node2 30583 1726853670.76330: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30583 1726853670.76334: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 30583 1726853670.76337: getting variables 30583 1726853670.76339: in VariableManager get_vars() 30583 1726853670.76367: Calling all_inventory to load vars for managed_node2 30583 1726853670.76369: Calling groups_inventory to load vars for managed_node2 30583 1726853670.76373: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853670.76391: Calling all_plugins_play to load vars for managed_node2 30583 1726853670.76394: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853670.76397: Calling groups_plugins_play to load vars for managed_node2 30583 1726853670.76597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853670.76809: done with get_vars() 30583 1726853670.76828: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 13:34:30 -0400 (0:00:00.389) 0:00:06.106 ****** 30583 1726853670.76925: entering _queue_task() for managed_node2/include_tasks 30583 1726853670.77301: worker is 1 (out of 1 available) 30583 1726853670.77313: exiting _queue_task() for managed_node2/include_tasks 30583 1726853670.77323: done queuing things up, now waiting for results queue to drain 30583 1726853670.77325: waiting for pending results... 
30583 1726853670.77503: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 30583 1726853670.77593: in run() - task 02083763-bbaf-05ea-abc5-00000000011f 30583 1726853670.77603: variable 'ansible_search_path' from source: unknown 30583 1726853670.77607: variable 'ansible_search_path' from source: unknown 30583 1726853670.77635: calling self._execute() 30583 1726853670.77705: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853670.77709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853670.77716: variable 'omit' from source: magic vars 30583 1726853670.77966: variable 'ansible_distribution_major_version' from source: facts 30583 1726853670.77976: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853670.77983: _execute() done 30583 1726853670.77986: dumping result to json 30583 1726853670.77990: done dumping result, returning 30583 1726853670.78001: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-05ea-abc5-00000000011f] 30583 1726853670.78004: sending task result for task 02083763-bbaf-05ea-abc5-00000000011f 30583 1726853670.78081: done sending task result for task 02083763-bbaf-05ea-abc5-00000000011f 30583 1726853670.78084: WORKER PROCESS EXITING 30583 1726853670.78124: no more pending results, returning what we have 30583 1726853670.78128: in VariableManager get_vars() 30583 1726853670.78154: Calling all_inventory to load vars for managed_node2 30583 1726853670.78159: Calling groups_inventory to load vars for managed_node2 30583 1726853670.78162: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853670.78173: Calling all_plugins_play to load vars for managed_node2 30583 1726853670.78175: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853670.78178: Calling groups_plugins_play to load vars for managed_node2 30583 
1726853670.78290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853670.78399: done with get_vars() 30583 1726853670.78405: variable 'ansible_search_path' from source: unknown 30583 1726853670.78406: variable 'ansible_search_path' from source: unknown 30583 1726853670.78411: variable 'item' from source: include params 30583 1726853670.78492: variable 'item' from source: include params 30583 1726853670.78517: we have included files to process 30583 1726853670.78518: generating all_blocks data 30583 1726853670.78519: done generating all_blocks data 30583 1726853670.78523: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30583 1726853670.78523: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30583 1726853670.78525: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30583 1726853670.78679: done processing included file 30583 1726853670.78681: iterating over new_blocks loaded from include file 30583 1726853670.78682: in VariableManager get_vars() 30583 1726853670.78691: done with get_vars() 30583 1726853670.78692: filtering new block on tags 30583 1726853670.78707: done filtering new block on tags 30583 1726853670.78708: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 30583 1726853670.78711: extending task lists for all hosts with included blocks 30583 1726853670.78805: done extending task lists 30583 1726853670.78806: done processing included files 30583 1726853670.78806: results queue empty 30583 1726853670.78807: checking for any_errors_fatal 30583 1726853670.78809: done 
checking for any_errors_fatal 30583 1726853670.78810: checking for max_fail_percentage 30583 1726853670.78810: done checking for max_fail_percentage 30583 1726853670.78811: checking to see if all hosts have failed and the running result is not ok 30583 1726853670.78811: done checking to see if all hosts have failed 30583 1726853670.78812: getting the remaining hosts for this loop 30583 1726853670.78813: done getting the remaining hosts for this loop 30583 1726853670.78814: getting the next task for host managed_node2 30583 1726853670.78817: done getting next task for host managed_node2 30583 1726853670.78818: ^ task is: TASK: Get stat for interface {{ interface }} 30583 1726853670.78820: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853670.78822: getting variables 30583 1726853670.78822: in VariableManager get_vars() 30583 1726853670.78827: Calling all_inventory to load vars for managed_node2 30583 1726853670.78829: Calling groups_inventory to load vars for managed_node2 30583 1726853670.78830: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853670.78833: Calling all_plugins_play to load vars for managed_node2 30583 1726853670.78834: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853670.78836: Calling groups_plugins_play to load vars for managed_node2 30583 1726853670.78936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853670.79042: done with get_vars() 30583 1726853670.79048: done getting variables 30583 1726853670.79126: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:34:30 -0400 (0:00:00.022) 0:00:06.128 ****** 30583 1726853670.79144: entering _queue_task() for managed_node2/stat 30583 1726853670.79309: worker is 1 (out of 1 available) 30583 1726853670.79321: exiting _queue_task() for managed_node2/stat 30583 1726853670.79331: done queuing things up, now waiting for results queue to drain 30583 1726853670.79333: waiting for pending results... 
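Based on the `module_args` recorded later in this trace (`path: /sys/class/net/statebr`, with `get_attributes`, `get_checksum`, and `get_mime` all false), the included `get_interface_stat.yml` task is approximately the following. This is a reconstruction from the logged invocation, not the verbatim file; the `register` variable name is an assumption:

```yaml
# Hedged reconstruction from the logged stat module_args.
# The register name is assumed; the path and flags come from the trace.
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat
```

With `interface: statebr` from play vars, this resolves to a stat of `/sys/class/net/statebr`, whose absence is what the subsequent assert task checks.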
30583 1726853670.79486: running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr 30583 1726853670.79566: in run() - task 02083763-bbaf-05ea-abc5-00000000016e 30583 1726853670.79578: variable 'ansible_search_path' from source: unknown 30583 1726853670.79582: variable 'ansible_search_path' from source: unknown 30583 1726853670.79676: calling self._execute() 30583 1726853670.79696: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853670.79707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853670.79720: variable 'omit' from source: magic vars 30583 1726853670.80049: variable 'ansible_distribution_major_version' from source: facts 30583 1726853670.80069: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853670.80083: variable 'omit' from source: magic vars 30583 1726853670.80136: variable 'omit' from source: magic vars 30583 1726853670.80269: variable 'interface' from source: play vars 30583 1726853670.80297: variable 'omit' from source: magic vars 30583 1726853670.80343: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853670.80392: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853670.80478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853670.80486: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853670.80488: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853670.80490: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853670.80492: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853670.80586: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853670.80606: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853670.80618: Set connection var ansible_timeout to 10 30583 1726853670.80624: Set connection var ansible_connection to ssh 30583 1726853670.80634: Set connection var ansible_shell_executable to /bin/sh 30583 1726853670.80676: Set connection var ansible_shell_type to sh 30583 1726853670.80679: Set connection var ansible_pipelining to False 30583 1726853670.80682: variable 'ansible_shell_executable' from source: unknown 30583 1726853670.80705: variable 'ansible_connection' from source: unknown 30583 1726853670.80730: variable 'ansible_module_compression' from source: unknown 30583 1726853670.80745: variable 'ansible_shell_type' from source: unknown 30583 1726853670.80754: variable 'ansible_shell_executable' from source: unknown 30583 1726853670.80791: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853670.80794: variable 'ansible_pipelining' from source: unknown 30583 1726853670.80796: variable 'ansible_timeout' from source: unknown 30583 1726853670.80802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853670.81008: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853670.81017: variable 'omit' from source: magic vars 30583 1726853670.81025: starting attempt loop 30583 1726853670.81028: running the handler 30583 1726853670.81040: _low_level_execute_command(): starting 30583 1726853670.81045: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853670.81534: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853670.81538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853670.81543: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853670.81545: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853670.81590: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853670.81604: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853670.81679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853670.83422: stdout chunk (state=3): >>>/root <<< 30583 1726853670.83583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853670.83586: stdout chunk (state=3): >>><<< 30583 1726853670.83589: stderr chunk (state=3): >>><<< 30583 1726853670.83609: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 
10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853670.83628: _low_level_execute_command(): starting 30583 1726853670.83640: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853670.8361552-30875-162364252420220 `" && echo ansible-tmp-1726853670.8361552-30875-162364252420220="` echo /root/.ansible/tmp/ansible-tmp-1726853670.8361552-30875-162364252420220 `" ) && sleep 0' 30583 1726853670.84302: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853670.84306: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853670.84309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853670.84332: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853670.84346: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853670.84381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853670.84384: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853670.84463: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853670.86439: stdout chunk (state=3): >>>ansible-tmp-1726853670.8361552-30875-162364252420220=/root/.ansible/tmp/ansible-tmp-1726853670.8361552-30875-162364252420220 <<< 30583 1726853670.86677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853670.86681: stdout chunk (state=3): >>><<< 30583 1726853670.86683: stderr chunk (state=3): >>><<< 30583 1726853670.86686: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853670.8361552-30875-162364252420220=/root/.ansible/tmp/ansible-tmp-1726853670.8361552-30875-162364252420220 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853670.86689: variable 'ansible_module_compression' from source: unknown 30583 1726853670.86744: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30583 1726853670.86788: variable 'ansible_facts' from source: unknown 30583 1726853670.86857: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853670.8361552-30875-162364252420220/AnsiballZ_stat.py 30583 1726853670.86962: Sending initial data 30583 1726853670.86965: Sent initial data (153 bytes) 30583 1726853670.87402: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853670.87405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853670.87408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853670.87410: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853670.87412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853670.87463: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853670.87468: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853670.87536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853670.89206: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30583 1726853670.89210: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853670.89282: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853670.89348: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpq2ce70h1 /root/.ansible/tmp/ansible-tmp-1726853670.8361552-30875-162364252420220/AnsiballZ_stat.py <<< 30583 1726853670.89356: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853670.8361552-30875-162364252420220/AnsiballZ_stat.py" <<< 30583 1726853670.89422: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpq2ce70h1" to remote "/root/.ansible/tmp/ansible-tmp-1726853670.8361552-30875-162364252420220/AnsiballZ_stat.py" <<< 30583 1726853670.89425: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853670.8361552-30875-162364252420220/AnsiballZ_stat.py" <<< 30583 1726853670.90087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853670.90130: stderr chunk (state=3): >>><<< 30583 1726853670.90133: stdout chunk (state=3): >>><<< 30583 1726853670.90154: done transferring module to remote 30583 1726853670.90164: _low_level_execute_command(): starting 30583 1726853670.90170: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853670.8361552-30875-162364252420220/ /root/.ansible/tmp/ansible-tmp-1726853670.8361552-30875-162364252420220/AnsiballZ_stat.py && sleep 0' 30583 1726853670.90618: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853670.90621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853670.90623: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853670.90625: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853670.90627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853670.90675: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853670.90679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853670.90751: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853670.92677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853670.92703: stderr chunk (state=3): >>><<< 30583 1726853670.92706: stdout chunk (state=3): >>><<< 30583 1726853670.92720: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853670.92723: _low_level_execute_command(): starting 30583 1726853670.92727: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853670.8361552-30875-162364252420220/AnsiballZ_stat.py && sleep 0' 30583 1726853670.93153: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853670.93160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853670.93191: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853670.93194: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853670.93196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853670.93198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 30583 1726853670.93254: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853670.93262: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853670.93264: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853670.93339: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853671.09115: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30583 1726853671.10679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853671.10683: stdout chunk (state=3): >>><<< 30583 1726853671.10686: stderr chunk (state=3): >>><<< 30583 1726853671.10688: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853671.10692: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853670.8361552-30875-162364252420220/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853671.10694: _low_level_execute_command(): starting 30583 1726853671.10696: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853670.8361552-30875-162364252420220/ > /dev/null 2>&1 && sleep 0' 30583 1726853671.11276: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853671.11301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853671.11304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853671.11314: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853671.11339: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853671.11342: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853671.11352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853671.11387: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853671.11576: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853671.11580: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853671.11582: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853671.11687: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853671.13591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853671.13595: stdout chunk (state=3): >>><<< 30583 1726853671.13600: stderr chunk (state=3): >>><<< 30583 1726853671.13684: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853671.13688: handler run complete 30583 1726853671.13690: attempt loop complete, returning result 30583 1726853671.13691: _execute() done 30583 1726853671.13693: dumping result to json 30583 1726853671.13695: done dumping result, returning 30583 1726853671.13696: done running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr [02083763-bbaf-05ea-abc5-00000000016e] 30583 1726853671.13698: sending task result for task 02083763-bbaf-05ea-abc5-00000000016e 30583 1726853671.13761: done sending task result for task 02083763-bbaf-05ea-abc5-00000000016e 30583 1726853671.13764: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 30583 1726853671.13837: no more pending results, returning what we have 30583 1726853671.13840: results queue empty 30583 1726853671.13841: checking for any_errors_fatal 30583 1726853671.13842: done checking for any_errors_fatal 30583 1726853671.13843: checking for max_fail_percentage 30583 1726853671.13845: done checking for max_fail_percentage 30583 1726853671.13846: checking to see if all hosts have failed and the running result is not ok 30583 
1726853671.13846: done checking to see if all hosts have failed 30583 1726853671.13847: getting the remaining hosts for this loop 30583 1726853671.13849: done getting the remaining hosts for this loop 30583 1726853671.13852: getting the next task for host managed_node2 30583 1726853671.13860: done getting next task for host managed_node2 30583 1726853671.13863: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 30583 1726853671.13866: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853671.13870: getting variables 30583 1726853671.13873: in VariableManager get_vars() 30583 1726853671.13898: Calling all_inventory to load vars for managed_node2 30583 1726853671.13900: Calling groups_inventory to load vars for managed_node2 30583 1726853671.13903: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853671.13912: Calling all_plugins_play to load vars for managed_node2 30583 1726853671.13920: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853671.13923: Calling groups_plugins_play to load vars for managed_node2 30583 1726853671.14109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853671.14339: done with get_vars() 30583 1726853671.14348: done getting variables 30583 1726853671.14443: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 30583 1726853671.14581: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 13:34:31 -0400 (0:00:00.354) 0:00:06.483 ****** 30583 1726853671.14615: entering _queue_task() for managed_node2/assert 30583 1726853671.14617: Creating lock for assert 30583 1726853671.14921: worker is 1 (out of 1 available) 30583 1726853671.14935: exiting _queue_task() for managed_node2/assert 30583 1726853671.14946: done queuing things up, now waiting for results queue to drain 30583 1726853671.14947: waiting for pending results... 
30583 1726853671.15531: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' 30583 1726853671.15537: in run() - task 02083763-bbaf-05ea-abc5-000000000120 30583 1726853671.15540: variable 'ansible_search_path' from source: unknown 30583 1726853671.15630: variable 'ansible_search_path' from source: unknown 30583 1726853671.15633: calling self._execute() 30583 1726853671.15636: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853671.15639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853671.15677: variable 'omit' from source: magic vars 30583 1726853671.16065: variable 'ansible_distribution_major_version' from source: facts 30583 1726853671.16088: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853671.16178: variable 'omit' from source: magic vars 30583 1726853671.16181: variable 'omit' from source: magic vars 30583 1726853671.16261: variable 'interface' from source: play vars 30583 1726853671.16307: variable 'omit' from source: magic vars 30583 1726853671.16366: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853671.16396: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853671.16416: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853671.16428: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853671.16437: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853671.16462: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853671.16465: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853671.16468: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853671.16550: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853671.16556: Set connection var ansible_timeout to 10 30583 1726853671.16561: Set connection var ansible_connection to ssh 30583 1726853671.16566: Set connection var ansible_shell_executable to /bin/sh 30583 1726853671.16569: Set connection var ansible_shell_type to sh 30583 1726853671.16578: Set connection var ansible_pipelining to False 30583 1726853671.16596: variable 'ansible_shell_executable' from source: unknown 30583 1726853671.16600: variable 'ansible_connection' from source: unknown 30583 1726853671.16604: variable 'ansible_module_compression' from source: unknown 30583 1726853671.16607: variable 'ansible_shell_type' from source: unknown 30583 1726853671.16609: variable 'ansible_shell_executable' from source: unknown 30583 1726853671.16611: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853671.16613: variable 'ansible_pipelining' from source: unknown 30583 1726853671.16615: variable 'ansible_timeout' from source: unknown 30583 1726853671.16617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853671.16719: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853671.16727: variable 'omit' from source: magic vars 30583 1726853671.16740: starting attempt loop 30583 1726853671.16743: running the handler 30583 1726853671.16835: variable 'interface_stat' from source: set_fact 30583 1726853671.16846: Evaluated conditional (not interface_stat.stat.exists): True 30583 1726853671.16853: handler run complete 30583 1726853671.16864: attempt loop complete, returning result 
30583 1726853671.16867: _execute() done 30583 1726853671.16869: dumping result to json 30583 1726853671.16873: done dumping result, returning 30583 1726853671.16879: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' [02083763-bbaf-05ea-abc5-000000000120] 30583 1726853671.16883: sending task result for task 02083763-bbaf-05ea-abc5-000000000120 30583 1726853671.16963: done sending task result for task 02083763-bbaf-05ea-abc5-000000000120 30583 1726853671.16965: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30583 1726853671.17015: no more pending results, returning what we have 30583 1726853671.17019: results queue empty 30583 1726853671.17020: checking for any_errors_fatal 30583 1726853671.17030: done checking for any_errors_fatal 30583 1726853671.17032: checking for max_fail_percentage 30583 1726853671.17034: done checking for max_fail_percentage 30583 1726853671.17035: checking to see if all hosts have failed and the running result is not ok 30583 1726853671.17035: done checking to see if all hosts have failed 30583 1726853671.17036: getting the remaining hosts for this loop 30583 1726853671.17039: done getting the remaining hosts for this loop 30583 1726853671.17042: getting the next task for host managed_node2 30583 1726853671.17051: done getting next task for host managed_node2 30583 1726853671.17054: ^ task is: TASK: Test 30583 1726853671.17058: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 30583 1726853671.17061: getting variables 30583 1726853671.17063: in VariableManager get_vars() 30583 1726853671.17094: Calling all_inventory to load vars for managed_node2 30583 1726853671.17097: Calling groups_inventory to load vars for managed_node2 30583 1726853671.17100: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853671.17108: Calling all_plugins_play to load vars for managed_node2 30583 1726853671.17110: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853671.17113: Calling groups_plugins_play to load vars for managed_node2 30583 1726853671.17233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853671.17399: done with get_vars() 30583 1726853671.17409: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 13:34:31 -0400 (0:00:00.028) 0:00:06.512 ****** 30583 1726853671.17501: entering _queue_task() for managed_node2/include_tasks 30583 1726853671.17760: worker is 1 (out of 1 available) 30583 1726853671.17977: exiting _queue_task() for managed_node2/include_tasks 30583 1726853671.17987: done queuing things up, now waiting for results queue to drain 30583 1726853671.17989: waiting for pending results... 
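The trace above has just finished the two tasks from `tasks/assert_device_absent.yml`: a `stat` of `/sys/class/net/statebr` (returning `exists: false`) followed by an `assert` whose conditional `not interface_stat.stat.exists` evaluated True. The playbook file itself is not reproduced in the log; the following is a minimal reconstruction consistent with the task names, module arguments, and register name recorded above — the `interface_stat` register and the `get_attributes/get_checksum/get_mime` arguments are confirmed by the log, but the file layout is a sketch, not the actual source:

```yaml
# tasks/assert_device_absent.yml -- reconstructed sketch, not the original file
- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    path: "/sys/class/net/{{ interface }}"
    get_attributes: false   # matches the module args shown in the log
    get_checksum: false
    get_mime: false
  register: interface_stat

- name: Assert that the interface is absent - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - not interface_stat.stat.exists   # evaluated as True in the trace above
```

With `interface: statebr` coming from play vars (as the log's "variable 'interface' from source: play vars" lines show), a missing `/sys/class/net/statebr` yields `ok` with "All assertions passed", which is exactly the result dumped above.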
30583 1726853671.18115: running TaskExecutor() for managed_node2/TASK: Test 30583 1726853671.18152: in run() - task 02083763-bbaf-05ea-abc5-000000000095 30583 1726853671.18182: variable 'ansible_search_path' from source: unknown 30583 1726853671.18194: variable 'ansible_search_path' from source: unknown 30583 1726853671.18246: variable 'lsr_test' from source: include params 30583 1726853671.18477: variable 'lsr_test' from source: include params 30583 1726853671.18528: variable 'omit' from source: magic vars 30583 1726853671.18677: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853671.18757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853671.18761: variable 'omit' from source: magic vars 30583 1726853671.18924: variable 'ansible_distribution_major_version' from source: facts 30583 1726853671.18946: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853671.18961: variable 'item' from source: unknown 30583 1726853671.19030: variable 'item' from source: unknown 30583 1726853671.19068: variable 'item' from source: unknown 30583 1726853671.19138: variable 'item' from source: unknown 30583 1726853671.19579: dumping result to json 30583 1726853671.19583: done dumping result, returning 30583 1726853671.19585: done running TaskExecutor() for managed_node2/TASK: Test [02083763-bbaf-05ea-abc5-000000000095] 30583 1726853671.19588: sending task result for task 02083763-bbaf-05ea-abc5-000000000095 30583 1726853671.19629: done sending task result for task 02083763-bbaf-05ea-abc5-000000000095 30583 1726853671.19678: WORKER PROCESS EXITING 30583 1726853671.19805: no more pending results, returning what we have 30583 1726853671.19809: in VariableManager get_vars() 30583 1726853671.19837: Calling all_inventory to load vars for managed_node2 30583 1726853671.19840: Calling groups_inventory to load vars for managed_node2 30583 1726853671.19843: Calling all_plugins_inventory to load 
vars for managed_node2 30583 1726853671.19852: Calling all_plugins_play to load vars for managed_node2 30583 1726853671.19854: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853671.19858: Calling groups_plugins_play to load vars for managed_node2 30583 1726853671.20494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853671.20748: done with get_vars() 30583 1726853671.20753: variable 'ansible_search_path' from source: unknown 30583 1726853671.20754: variable 'ansible_search_path' from source: unknown 30583 1726853671.20781: we have included files to process 30583 1726853671.20782: generating all_blocks data 30583 1726853671.20783: done generating all_blocks data 30583 1726853671.20785: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30583 1726853671.20785: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30583 1726853671.20787: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30583 1726853671.20995: done processing included file 30583 1726853671.20997: iterating over new_blocks loaded from include file 30583 1726853671.20998: in VariableManager get_vars() 30583 1726853671.21007: done with get_vars() 30583 1726853671.21008: filtering new block on tags 30583 1726853671.21027: done filtering new block on tags 30583 1726853671.21028: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node2 => (item=tasks/create_bridge_profile.yml) 30583 1726853671.21032: extending task lists for all hosts with included blocks 30583 1726853671.21500: done 
extending task lists 30583 1726853671.21502: done processing included files 30583 1726853671.21502: results queue empty 30583 1726853671.21503: checking for any_errors_fatal 30583 1726853671.21504: done checking for any_errors_fatal 30583 1726853671.21505: checking for max_fail_percentage 30583 1726853671.21506: done checking for max_fail_percentage 30583 1726853671.21506: checking to see if all hosts have failed and the running result is not ok 30583 1726853671.21506: done checking to see if all hosts have failed 30583 1726853671.21507: getting the remaining hosts for this loop 30583 1726853671.21508: done getting the remaining hosts for this loop 30583 1726853671.21509: getting the next task for host managed_node2 30583 1726853671.21512: done getting next task for host managed_node2 30583 1726853671.21513: ^ task is: TASK: Include network role 30583 1726853671.21515: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853671.21517: getting variables 30583 1726853671.21517: in VariableManager get_vars() 30583 1726853671.21523: Calling all_inventory to load vars for managed_node2 30583 1726853671.21524: Calling groups_inventory to load vars for managed_node2 30583 1726853671.21525: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853671.21529: Calling all_plugins_play to load vars for managed_node2 30583 1726853671.21531: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853671.21532: Calling groups_plugins_play to load vars for managed_node2 30583 1726853671.21628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853671.21732: done with get_vars() 30583 1726853671.21738: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Friday 20 September 2024 13:34:31 -0400 (0:00:00.042) 0:00:06.555 ****** 30583 1726853671.21787: entering _queue_task() for managed_node2/include_role 30583 1726853671.21788: Creating lock for include_role 30583 1726853671.22003: worker is 1 (out of 1 available) 30583 1726853671.22018: exiting _queue_task() for managed_node2/include_role 30583 1726853671.22030: done queuing things up, now waiting for results queue to drain 30583 1726853671.22031: waiting for pending results... 
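At this point the run is entering `tasks/create_bridge_profile.yml`, whose task at line 3 is "Include network role"; the subsequent "included: fedora.linux_system_roles.network" line confirms which role it pulls in. The file's contents are not shown in the log, so the following is only a plausible sketch — the task name and role name are confirmed, while the `network_connections` payload is a purely illustrative assumption for a bridge-profile test:

```yaml
# tasks/create_bridge_profile.yml -- sketch; only the task name and role
# name are confirmed by the log, the vars payload is hypothetical
- name: Include network role
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.network
  vars:
    network_connections:   # illustrative example, not from the log
      - name: statebr
        type: bridge
        state: up
```

Using `include_role` (rather than a static `roles:` entry) is what produces the dynamic loading visible in the trace: defaults/main.yml, meta/main.yml, and tasks/main.yml are read only once the task actually executes.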
30583 1726853671.22220: running TaskExecutor() for managed_node2/TASK: Include network role 30583 1726853671.22323: in run() - task 02083763-bbaf-05ea-abc5-00000000018e 30583 1726853671.22334: variable 'ansible_search_path' from source: unknown 30583 1726853671.22337: variable 'ansible_search_path' from source: unknown 30583 1726853671.22412: calling self._execute() 30583 1726853671.22627: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853671.22634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853671.22637: variable 'omit' from source: magic vars 30583 1726853671.23250: variable 'ansible_distribution_major_version' from source: facts 30583 1726853671.23275: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853671.23286: _execute() done 30583 1726853671.23294: dumping result to json 30583 1726853671.23302: done dumping result, returning 30583 1726853671.23312: done running TaskExecutor() for managed_node2/TASK: Include network role [02083763-bbaf-05ea-abc5-00000000018e] 30583 1726853671.23321: sending task result for task 02083763-bbaf-05ea-abc5-00000000018e 30583 1726853671.23463: no more pending results, returning what we have 30583 1726853671.23468: in VariableManager get_vars() 30583 1726853671.23586: Calling all_inventory to load vars for managed_node2 30583 1726853671.23590: Calling groups_inventory to load vars for managed_node2 30583 1726853671.23594: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853671.23607: Calling all_plugins_play to load vars for managed_node2 30583 1726853671.23610: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853671.23613: Calling groups_plugins_play to load vars for managed_node2 30583 1726853671.23978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853671.24273: done with get_vars() 30583 1726853671.24281: 
variable 'ansible_search_path' from source: unknown 30583 1726853671.24282: variable 'ansible_search_path' from source: unknown 30583 1726853671.24460: variable 'omit' from source: magic vars 30583 1726853671.24501: variable 'omit' from source: magic vars 30583 1726853671.24516: variable 'omit' from source: magic vars 30583 1726853671.24520: we have included files to process 30583 1726853671.24521: generating all_blocks data 30583 1726853671.24522: done generating all_blocks data 30583 1726853671.24523: processing included file: fedora.linux_system_roles.network 30583 1726853671.24545: in VariableManager get_vars() 30583 1726853671.24556: done with get_vars() 30583 1726853671.24604: done sending task result for task 02083763-bbaf-05ea-abc5-00000000018e 30583 1726853671.24609: WORKER PROCESS EXITING 30583 1726853671.24641: in VariableManager get_vars() 30583 1726853671.24652: done with get_vars() 30583 1726853671.24687: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30583 1726853671.24855: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30583 1726853671.24940: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30583 1726853671.25324: in VariableManager get_vars() 30583 1726853671.25336: done with get_vars() 30583 1726853671.25617: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30583 1726853671.26811: iterating over new_blocks loaded from include file 30583 1726853671.26813: in VariableManager get_vars() 30583 1726853671.26828: done with get_vars() 30583 1726853671.26830: filtering new block on tags 30583 1726853671.27088: done filtering new block on tags 30583 1726853671.27092: in VariableManager get_vars() 30583 1726853671.27106: done with get_vars() 30583 1726853671.27108: 
filtering new block on tags 30583 1726853671.27124: done filtering new block on tags 30583 1726853671.27126: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30583 1726853671.27130: extending task lists for all hosts with included blocks 30583 1726853671.27289: done extending task lists 30583 1726853671.27290: done processing included files 30583 1726853671.27291: results queue empty 30583 1726853671.27291: checking for any_errors_fatal 30583 1726853671.27295: done checking for any_errors_fatal 30583 1726853671.27295: checking for max_fail_percentage 30583 1726853671.27296: done checking for max_fail_percentage 30583 1726853671.27297: checking to see if all hosts have failed and the running result is not ok 30583 1726853671.27298: done checking to see if all hosts have failed 30583 1726853671.27298: getting the remaining hosts for this loop 30583 1726853671.27300: done getting the remaining hosts for this loop 30583 1726853671.27302: getting the next task for host managed_node2 30583 1726853671.27307: done getting next task for host managed_node2 30583 1726853671.27309: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30583 1726853671.27313: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853671.27321: getting variables 30583 1726853671.27322: in VariableManager get_vars() 30583 1726853671.27333: Calling all_inventory to load vars for managed_node2 30583 1726853671.27335: Calling groups_inventory to load vars for managed_node2 30583 1726853671.27337: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853671.27342: Calling all_plugins_play to load vars for managed_node2 30583 1726853671.27344: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853671.27346: Calling groups_plugins_play to load vars for managed_node2 30583 1726853671.27488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853671.27615: done with get_vars() 30583 1726853671.27621: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:34:31 -0400 (0:00:00.058) 0:00:06.613 ****** 30583 1726853671.27670: entering _queue_task() for managed_node2/include_tasks 30583 1726853671.27892: worker is 1 (out of 1 available) 30583 1726853671.27904: exiting _queue_task() for managed_node2/include_tasks 30583 1726853671.27915: done queuing things up, now waiting for results queue to drain 30583 1726853671.27917: waiting for pending results... 
30583 1726853671.28080: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30583 1726853671.28158: in run() - task 02083763-bbaf-05ea-abc5-00000000020c 30583 1726853671.28173: variable 'ansible_search_path' from source: unknown 30583 1726853671.28181: variable 'ansible_search_path' from source: unknown 30583 1726853671.28208: calling self._execute() 30583 1726853671.28273: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853671.28278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853671.28286: variable 'omit' from source: magic vars 30583 1726853671.28540: variable 'ansible_distribution_major_version' from source: facts 30583 1726853671.28549: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853671.28555: _execute() done 30583 1726853671.28561: dumping result to json 30583 1726853671.28564: done dumping result, returning 30583 1726853671.28573: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-05ea-abc5-00000000020c] 30583 1726853671.28576: sending task result for task 02083763-bbaf-05ea-abc5-00000000020c 30583 1726853671.28658: done sending task result for task 02083763-bbaf-05ea-abc5-00000000020c 30583 1726853671.28660: WORKER PROCESS EXITING 30583 1726853671.28726: no more pending results, returning what we have 30583 1726853671.28730: in VariableManager get_vars() 30583 1726853671.28762: Calling all_inventory to load vars for managed_node2 30583 1726853671.28764: Calling groups_inventory to load vars for managed_node2 30583 1726853671.28766: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853671.28777: Calling all_plugins_play to load vars for managed_node2 30583 1726853671.28779: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853671.28781: Calling 
groups_plugins_play to load vars for managed_node2 30583 1726853671.28903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853671.29034: done with get_vars() 30583 1726853671.29040: variable 'ansible_search_path' from source: unknown 30583 1726853671.29040: variable 'ansible_search_path' from source: unknown 30583 1726853671.29067: we have included files to process 30583 1726853671.29068: generating all_blocks data 30583 1726853671.29069: done generating all_blocks data 30583 1726853671.29073: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853671.29074: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853671.29075: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853671.29504: done processing included file 30583 1726853671.29505: iterating over new_blocks loaded from include file 30583 1726853671.29506: in VariableManager get_vars() 30583 1726853671.29520: done with get_vars() 30583 1726853671.29521: filtering new block on tags 30583 1726853671.29538: done filtering new block on tags 30583 1726853671.29540: in VariableManager get_vars() 30583 1726853671.29551: done with get_vars() 30583 1726853671.29552: filtering new block on tags 30583 1726853671.29580: done filtering new block on tags 30583 1726853671.29581: in VariableManager get_vars() 30583 1726853671.29596: done with get_vars() 30583 1726853671.29597: filtering new block on tags 30583 1726853671.29621: done filtering new block on tags 30583 1726853671.29622: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30583 1726853671.29625: extending task lists for 
all hosts with included blocks 30583 1726853671.30554: done extending task lists 30583 1726853671.30556: done processing included files 30583 1726853671.30557: results queue empty 30583 1726853671.30557: checking for any_errors_fatal 30583 1726853671.30559: done checking for any_errors_fatal 30583 1726853671.30559: checking for max_fail_percentage 30583 1726853671.30560: done checking for max_fail_percentage 30583 1726853671.30560: checking to see if all hosts have failed and the running result is not ok 30583 1726853671.30561: done checking to see if all hosts have failed 30583 1726853671.30561: getting the remaining hosts for this loop 30583 1726853671.30562: done getting the remaining hosts for this loop 30583 1726853671.30564: getting the next task for host managed_node2 30583 1726853671.30568: done getting next task for host managed_node2 30583 1726853671.30572: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30583 1726853671.30575: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853671.30582: getting variables 30583 1726853671.30582: in VariableManager get_vars() 30583 1726853671.30591: Calling all_inventory to load vars for managed_node2 30583 1726853671.30592: Calling groups_inventory to load vars for managed_node2 30583 1726853671.30593: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853671.30597: Calling all_plugins_play to load vars for managed_node2 30583 1726853671.30598: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853671.30600: Calling groups_plugins_play to load vars for managed_node2 30583 1726853671.30680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853671.30794: done with get_vars() 30583 1726853671.30801: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:34:31 -0400 (0:00:00.031) 0:00:06.645 ****** 30583 1726853671.30845: entering _queue_task() for managed_node2/setup 30583 1726853671.31048: worker is 1 (out of 1 available) 30583 1726853671.31062: exiting _queue_task() for managed_node2/setup 30583 1726853671.31077: done queuing things up, now waiting for results queue to drain 30583 1726853671.31078: waiting for pending results... 
30583 1726853671.31228: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30583 1726853671.31317: in run() - task 02083763-bbaf-05ea-abc5-000000000269 30583 1726853671.31329: variable 'ansible_search_path' from source: unknown 30583 1726853671.31333: variable 'ansible_search_path' from source: unknown 30583 1726853671.31362: calling self._execute() 30583 1726853671.31424: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853671.31428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853671.31437: variable 'omit' from source: magic vars 30583 1726853671.31685: variable 'ansible_distribution_major_version' from source: facts 30583 1726853671.31693: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853671.31864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853671.33364: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853671.33410: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853671.33436: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853671.33461: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853671.33495: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853671.33549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853671.33570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853671.33595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853671.33619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853671.33630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853671.33665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853671.33683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853671.33703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853671.33728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853671.33738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853671.33840: variable '__network_required_facts' from source: role 
'' defaults 30583 1726853671.33846: variable 'ansible_facts' from source: unknown 30583 1726853671.33898: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30583 1726853671.33902: when evaluation is False, skipping this task 30583 1726853671.33906: _execute() done 30583 1726853671.33908: dumping result to json 30583 1726853671.33912: done dumping result, returning 30583 1726853671.33915: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-05ea-abc5-000000000269] 30583 1726853671.33918: sending task result for task 02083763-bbaf-05ea-abc5-000000000269 30583 1726853671.34003: done sending task result for task 02083763-bbaf-05ea-abc5-000000000269 30583 1726853671.34005: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853671.34070: no more pending results, returning what we have 30583 1726853671.34074: results queue empty 30583 1726853671.34075: checking for any_errors_fatal 30583 1726853671.34077: done checking for any_errors_fatal 30583 1726853671.34077: checking for max_fail_percentage 30583 1726853671.34079: done checking for max_fail_percentage 30583 1726853671.34080: checking to see if all hosts have failed and the running result is not ok 30583 1726853671.34080: done checking to see if all hosts have failed 30583 1726853671.34081: getting the remaining hosts for this loop 30583 1726853671.34083: done getting the remaining hosts for this loop 30583 1726853671.34087: getting the next task for host managed_node2 30583 1726853671.34096: done getting next task for host managed_node2 30583 1726853671.34100: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30583 1726853671.34104: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, 
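The skip recorded above comes from the role's gating expression, `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`: gather facts only if some required fact is still missing. A minimal Python sketch of that set-difference check (the variable values below are illustrative, not taken from this run):

```python
# Sketch of the gating logic behind "Ensure ansible_facts used by role are
# present": run setup only when a required fact is absent. The real check is
# a Jinja2 expression using the `difference` filter; names here are examples.

def facts_missing(required_facts, ansible_facts):
    """Mirror `__network_required_facts | difference(ansible_facts.keys() | list)`:
    return the required facts not yet present."""
    return [f for f in required_facts if f not in ansible_facts]

required = ["distribution", "distribution_major_version", "os_family"]
facts = {
    "distribution": "Fedora",
    "distribution_major_version": "40",
    "os_family": "RedHat",
}

# `| length > 0` would mean "run the setup task"; nothing is missing here,
# matching `Evaluated conditional (...): False` followed by the skip.
run_setup = len(facts_missing(required, facts)) > 0
```

When every required fact is already cached, the task executor short-circuits before any connection is made, which is why no ssh traffic appears for this task.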
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853671.34117: getting variables 30583 1726853671.34118: in VariableManager get_vars() 30583 1726853671.34147: Calling all_inventory to load vars for managed_node2 30583 1726853671.34149: Calling groups_inventory to load vars for managed_node2 30583 1726853671.34151: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853671.34162: Calling all_plugins_play to load vars for managed_node2 30583 1726853671.34164: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853671.34174: Calling groups_plugins_play to load vars for managed_node2 30583 1726853671.34317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853671.34441: done with get_vars() 30583 1726853671.34449: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:34:31 -0400 (0:00:00.036) 0:00:06.682 ****** 30583 1726853671.34516: entering _queue_task() for managed_node2/stat 30583 1726853671.34717: worker is 1 (out of 1 available) 30583 1726853671.34731: exiting _queue_task() for managed_node2/stat 30583 1726853671.34743: done queuing things up, now waiting for results queue to drain 30583 1726853671.34745: waiting for pending results... 
30583 1726853671.34907: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30583 1726853671.35003: in run() - task 02083763-bbaf-05ea-abc5-00000000026b 30583 1726853671.35013: variable 'ansible_search_path' from source: unknown 30583 1726853671.35017: variable 'ansible_search_path' from source: unknown 30583 1726853671.35046: calling self._execute() 30583 1726853671.35111: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853671.35115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853671.35123: variable 'omit' from source: magic vars 30583 1726853671.35380: variable 'ansible_distribution_major_version' from source: facts 30583 1726853671.35390: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853671.35503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853671.35695: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853671.35726: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853671.35752: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853671.35781: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853671.35866: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853671.35886: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853671.35904: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853671.35921: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853671.35986: variable '__network_is_ostree' from source: set_fact 30583 1726853671.35990: Evaluated conditional (not __network_is_ostree is defined): False 30583 1726853671.35993: when evaluation is False, skipping this task 30583 1726853671.35997: _execute() done 30583 1726853671.36000: dumping result to json 30583 1726853671.36004: done dumping result, returning 30583 1726853671.36011: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-05ea-abc5-00000000026b] 30583 1726853671.36016: sending task result for task 02083763-bbaf-05ea-abc5-00000000026b 30583 1726853671.36096: done sending task result for task 02083763-bbaf-05ea-abc5-00000000026b 30583 1726853671.36099: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30583 1726853671.36144: no more pending results, returning what we have 30583 1726853671.36147: results queue empty 30583 1726853671.36148: checking for any_errors_fatal 30583 1726853671.36154: done checking for any_errors_fatal 30583 1726853671.36155: checking for max_fail_percentage 30583 1726853671.36157: done checking for max_fail_percentage 30583 1726853671.36158: checking to see if all hosts have failed and the running result is not ok 30583 1726853671.36159: done checking to see if all hosts have failed 30583 1726853671.36159: getting the remaining hosts for this loop 30583 1726853671.36161: done getting the remaining hosts for this loop 30583 
1726853671.36165: getting the next task for host managed_node2 30583 1726853671.36173: done getting next task for host managed_node2 30583 1726853671.36177: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30583 1726853671.36181: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853671.36193: getting variables 30583 1726853671.36195: in VariableManager get_vars() 30583 1726853671.36223: Calling all_inventory to load vars for managed_node2 30583 1726853671.36225: Calling groups_inventory to load vars for managed_node2 30583 1726853671.36227: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853671.36235: Calling all_plugins_play to load vars for managed_node2 30583 1726853671.36238: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853671.36240: Calling groups_plugins_play to load vars for managed_node2 30583 1726853671.36362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853671.36487: done with get_vars() 30583 1726853671.36496: done getting variables 30583 1726853671.36534: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:34:31 -0400 (0:00:00.020) 0:00:06.702 ****** 30583 1726853671.36558: entering _queue_task() for managed_node2/set_fact 30583 1726853671.36744: worker is 1 (out of 1 available) 30583 1726853671.36757: exiting _queue_task() for managed_node2/set_fact 30583 1726853671.36769: done queuing things up, now waiting for results queue to drain 30583 1726853671.36772: waiting for pending results... 
30583 1726853671.36928: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30583 1726853671.37013: in run() - task 02083763-bbaf-05ea-abc5-00000000026c 30583 1726853671.37022: variable 'ansible_search_path' from source: unknown 30583 1726853671.37027: variable 'ansible_search_path' from source: unknown 30583 1726853671.37055: calling self._execute() 30583 1726853671.37113: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853671.37117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853671.37126: variable 'omit' from source: magic vars 30583 1726853671.37419: variable 'ansible_distribution_major_version' from source: facts 30583 1726853671.37427: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853671.37536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853671.37718: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853671.37747: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853671.37776: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853671.37803: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853671.37874: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853671.37899: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853671.37917: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853671.37934: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853671.38002: variable '__network_is_ostree' from source: set_fact 30583 1726853671.38008: Evaluated conditional (not __network_is_ostree is defined): False 30583 1726853671.38011: when evaluation is False, skipping this task 30583 1726853671.38014: _execute() done 30583 1726853671.38016: dumping result to json 30583 1726853671.38019: done dumping result, returning 30583 1726853671.38027: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-05ea-abc5-00000000026c] 30583 1726853671.38029: sending task result for task 02083763-bbaf-05ea-abc5-00000000026c 30583 1726853671.38109: done sending task result for task 02083763-bbaf-05ea-abc5-00000000026c 30583 1726853671.38112: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30583 1726853671.38157: no more pending results, returning what we have 30583 1726853671.38160: results queue empty 30583 1726853671.38161: checking for any_errors_fatal 30583 1726853671.38164: done checking for any_errors_fatal 30583 1726853671.38165: checking for max_fail_percentage 30583 1726853671.38167: done checking for max_fail_percentage 30583 1726853671.38168: checking to see if all hosts have failed and the running result is not ok 30583 1726853671.38169: done checking to see if all hosts have failed 30583 1726853671.38169: getting the remaining hosts for this loop 30583 1726853671.38172: done getting the remaining hosts for this loop 
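Both ostree tasks ("Check if system is ostree" and "Set flag to indicate system is ostree") are guarded by `when: not __network_is_ostree is defined`, so once the fact exists — here set by an earlier `set_fact`, per `variable '__network_is_ostree' from source: set_fact` — later evaluations skip. A small sketch of that detect-once pattern (hypothetical helper name, not Ansible code):

```python
# Detect-once guard: the stat + set_fact pair only runs while the cached
# fact is undefined. `should_check_ostree` is an illustrative helper that
# mirrors the Jinja2 test `not __network_is_ostree is defined`.

def should_check_ostree(facts):
    return "__network_is_ostree" not in facts

facts = {"__network_is_ostree": False}  # populated by an earlier set_fact

# Matches `Evaluated conditional (not __network_is_ostree is defined): False`
check_needed = should_check_ostree(facts)
```

The value of the fact is irrelevant to the guard; only its presence is tested, so `False` (not an ostree system) still suppresses the re-check.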
30583 1726853671.38176: getting the next task for host managed_node2 30583 1726853671.38185: done getting next task for host managed_node2 30583 1726853671.38188: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853671.38192: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853671.38204: getting variables 30583 1726853671.38205: in VariableManager get_vars() 30583 1726853671.38235: Calling all_inventory to load vars for managed_node2 30583 1726853671.38237: Calling groups_inventory to load vars for managed_node2 30583 1726853671.38240: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853671.38247: Calling all_plugins_play to load vars for managed_node2 30583 1726853671.38249: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853671.38252: Calling groups_plugins_play to load vars for managed_node2 30583 1726853671.38387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853671.38503: done with get_vars() 30583 1726853671.38510: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:34:31 -0400 (0:00:00.020) 0:00:06.722 ****** 30583 1726853671.38572: entering _queue_task() for managed_node2/service_facts 30583 1726853671.38574: Creating lock for service_facts 30583 1726853671.38758: worker is 1 (out of 1 available) 30583 1726853671.38774: exiting _queue_task() for managed_node2/service_facts 30583 1726853671.38786: done queuing things up, now waiting for results queue to drain 30583 1726853671.38787: waiting for pending results... 
30583 1726853671.38937: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853671.39009: in run() - task 02083763-bbaf-05ea-abc5-00000000026e 30583 1726853671.39020: variable 'ansible_search_path' from source: unknown 30583 1726853671.39025: variable 'ansible_search_path' from source: unknown 30583 1726853671.39051: calling self._execute() 30583 1726853671.39111: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853671.39114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853671.39125: variable 'omit' from source: magic vars 30583 1726853671.39362: variable 'ansible_distribution_major_version' from source: facts 30583 1726853671.39366: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853671.39374: variable 'omit' from source: magic vars 30583 1726853671.39417: variable 'omit' from source: magic vars 30583 1726853671.39440: variable 'omit' from source: magic vars 30583 1726853671.39473: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853671.39499: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853671.39513: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853671.39526: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853671.39535: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853671.39560: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853671.39563: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853671.39565: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853671.39631: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853671.39634: Set connection var ansible_timeout to 10 30583 1726853671.39637: Set connection var ansible_connection to ssh 30583 1726853671.39643: Set connection var ansible_shell_executable to /bin/sh 30583 1726853671.39646: Set connection var ansible_shell_type to sh 30583 1726853671.39653: Set connection var ansible_pipelining to False 30583 1726853671.39673: variable 'ansible_shell_executable' from source: unknown 30583 1726853671.39682: variable 'ansible_connection' from source: unknown 30583 1726853671.39685: variable 'ansible_module_compression' from source: unknown 30583 1726853671.39687: variable 'ansible_shell_type' from source: unknown 30583 1726853671.39690: variable 'ansible_shell_executable' from source: unknown 30583 1726853671.39692: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853671.39694: variable 'ansible_pipelining' from source: unknown 30583 1726853671.39696: variable 'ansible_timeout' from source: unknown 30583 1726853671.39698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853671.39839: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853671.39846: variable 'omit' from source: magic vars 30583 1726853671.39851: starting attempt loop 30583 1726853671.39854: running the handler 30583 1726853671.39865: _low_level_execute_command(): starting 30583 1726853671.39874: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853671.40375: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30583 1726853671.40379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853671.40382: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853671.40384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853671.40435: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853671.40438: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853671.40440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853671.40519: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853671.42248: stdout chunk (state=3): >>>/root <<< 30583 1726853671.42347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853671.42375: stderr chunk (state=3): >>><<< 30583 1726853671.42378: stdout chunk (state=3): >>><<< 30583 1726853671.42396: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853671.42405: _low_level_execute_command(): starting 30583 1726853671.42410: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853671.4239461-30910-55871425875433 `" && echo ansible-tmp-1726853671.4239461-30910-55871425875433="` echo /root/.ansible/tmp/ansible-tmp-1726853671.4239461-30910-55871425875433 `" ) && sleep 0' 30583 1726853671.42831: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853671.42834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853671.42837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853671.42846: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853671.42849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853671.42894: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853671.42901: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853671.42974: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853671.45398: stdout chunk (state=3): >>>ansible-tmp-1726853671.4239461-30910-55871425875433=/root/.ansible/tmp/ansible-tmp-1726853671.4239461-30910-55871425875433 <<< 30583 1726853671.45403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853671.45406: stdout chunk (state=3): >>><<< 30583 1726853671.45408: stderr chunk (state=3): >>><<< 30583 1726853671.45410: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853671.4239461-30910-55871425875433=/root/.ansible/tmp/ansible-tmp-1726853671.4239461-30910-55871425875433 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853671.45412: variable 'ansible_module_compression' from source: unknown 30583 1726853671.45443: ANSIBALLZ: Using lock for service_facts 30583 1726853671.45451: ANSIBALLZ: Acquiring lock 30583 1726853671.45457: ANSIBALLZ: Lock acquired: 139827453301280 30583 1726853671.45465: ANSIBALLZ: Creating module 30583 1726853671.62232: ANSIBALLZ: Writing module into payload 30583 1726853671.62302: ANSIBALLZ: Writing module 30583 1726853671.62319: ANSIBALLZ: Renaming module 30583 1726853671.62331: ANSIBALLZ: Done creating module 30583 1726853671.62351: variable 'ansible_facts' from source: unknown 30583 1726853671.62421: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853671.4239461-30910-55871425875433/AnsiballZ_service_facts.py 30583 1726853671.62663: Sending initial data 30583 1726853671.62666: Sent initial data (161 bytes) 30583 1726853671.63350: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853671.63383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853671.63386: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853671.63435: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853671.63439: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853671.63526: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853671.65231: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30583 1726853671.65238: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853671.65301: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853671.65410: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpcda4jl2i /root/.ansible/tmp/ansible-tmp-1726853671.4239461-30910-55871425875433/AnsiballZ_service_facts.py <<< 30583 1726853671.65413: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853671.4239461-30910-55871425875433/AnsiballZ_service_facts.py" <<< 30583 1726853671.65497: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpcda4jl2i" to remote "/root/.ansible/tmp/ansible-tmp-1726853671.4239461-30910-55871425875433/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853671.4239461-30910-55871425875433/AnsiballZ_service_facts.py" <<< 30583 1726853671.66418: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853671.66456: stderr chunk (state=3): >>><<< 30583 1726853671.66480: stdout chunk (state=3): >>><<< 30583 1726853671.66566: done transferring module to remote 30583 1726853671.66570: _low_level_execute_command(): starting 30583 1726853671.66575: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853671.4239461-30910-55871425875433/ /root/.ansible/tmp/ansible-tmp-1726853671.4239461-30910-55871425875433/AnsiballZ_service_facts.py && sleep 0' 30583 1726853671.67014: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853671.67017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853671.67019: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853671.67025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853671.67027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853671.67073: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853671.67082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853671.67150: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853671.69116: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853671.69119: stderr chunk (state=3): >>><<< 30583 1726853671.69121: stdout chunk (state=3): >>><<< 30583 1726853671.69214: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853671.69219: _low_level_execute_command(): starting 30583 1726853671.69222: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853671.4239461-30910-55871425875433/AnsiballZ_service_facts.py && sleep 0' 30583 1726853671.69805: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853671.69822: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853671.69838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853671.69859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853671.69924: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853671.69936: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 
1726853671.69996: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853671.70021: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853671.70060: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853671.70182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853673.39454: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 30583 1726853673.39467: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": 
"stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": 
{"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": 
{"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.s<<< 30583 1726853673.39475: stdout chunk (state=3): >>>ervice", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": 
"systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "stat<<< 30583 1726853673.39478: stdout chunk (state=3): >>>us": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": 
"systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state":<<< 30583 1726853673.39482: stdout chunk (state=3): >>> "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": 
{"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": 
"serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": <<< 30583 1726853673.39485: stdout chunk (state=3): >>>"static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": 
"systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30583 1726853673.41039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853673.41085: stderr chunk (state=3): >>><<< 30583 1726853673.41089: stdout chunk (state=3): >>><<< 30583 1726853673.41112: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": 
{"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": 
"plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": 
{"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": 
"systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", 
"status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": 
"systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853673.41841: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853671.4239461-30910-55871425875433/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853673.41845: _low_level_execute_command(): starting 30583 1726853673.41847: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853671.4239461-30910-55871425875433/ > /dev/null 2>&1 && sleep 0' 30583 1726853673.42382: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853673.42397: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853673.42410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853673.42428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853673.42445: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853673.42456: stderr chunk 
(state=3): >>>debug2: match not found <<< 30583 1726853673.42470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853673.42496: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853673.42508: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853673.42518: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853673.42529: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853673.42542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853673.42558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853673.42570: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853673.42650: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853673.42673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853673.42690: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853673.42795: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853673.44758: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853673.44780: stdout chunk (state=3): >>><<< 30583 1726853673.44792: stderr chunk (state=3): >>><<< 30583 1726853673.44811: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853673.44823: handler run complete 30583 1726853673.45026: variable 'ansible_facts' from source: unknown 30583 1726853673.45203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853673.45702: variable 'ansible_facts' from source: unknown 30583 1726853673.45876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853673.46150: attempt loop complete, returning result 30583 1726853673.46153: _execute() done 30583 1726853673.46158: dumping result to json 30583 1726853673.46161: done dumping result, returning 30583 1726853673.46163: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-05ea-abc5-00000000026e] 30583 1726853673.46166: sending task result for task 02083763-bbaf-05ea-abc5-00000000026e 30583 1726853673.47266: done sending task result for task 
02083763-bbaf-05ea-abc5-00000000026e 30583 1726853673.47269: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853673.47310: no more pending results, returning what we have 30583 1726853673.47312: results queue empty 30583 1726853673.47314: checking for any_errors_fatal 30583 1726853673.47316: done checking for any_errors_fatal 30583 1726853673.47317: checking for max_fail_percentage 30583 1726853673.47318: done checking for max_fail_percentage 30583 1726853673.47319: checking to see if all hosts have failed and the running result is not ok 30583 1726853673.47319: done checking to see if all hosts have failed 30583 1726853673.47320: getting the remaining hosts for this loop 30583 1726853673.47321: done getting the remaining hosts for this loop 30583 1726853673.47323: getting the next task for host managed_node2 30583 1726853673.47327: done getting next task for host managed_node2 30583 1726853673.47329: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 1726853673.47333: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853673.47346: getting variables 30583 1726853673.47347: in VariableManager get_vars() 30583 1726853673.47367: Calling all_inventory to load vars for managed_node2 30583 1726853673.47369: Calling groups_inventory to load vars for managed_node2 30583 1726853673.47370: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853673.47379: Calling all_plugins_play to load vars for managed_node2 30583 1726853673.47381: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853673.47382: Calling groups_plugins_play to load vars for managed_node2 30583 1726853673.47595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853673.47876: done with get_vars() 30583 1726853673.47886: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:34:33 -0400 (0:00:02.093) 0:00:08.816 ****** 30583 1726853673.47950: entering _queue_task() for managed_node2/package_facts 30583 1726853673.47952: Creating lock for package_facts 30583 1726853673.48183: worker is 1 (out of 1 available) 30583 1726853673.48198: 
exiting _queue_task() for managed_node2/package_facts 30583 1726853673.48210: done queuing things up, now waiting for results queue to drain 30583 1726853673.48211: waiting for pending results... 30583 1726853673.48389: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 1726853673.48484: in run() - task 02083763-bbaf-05ea-abc5-00000000026f 30583 1726853673.48496: variable 'ansible_search_path' from source: unknown 30583 1726853673.48501: variable 'ansible_search_path' from source: unknown 30583 1726853673.48529: calling self._execute() 30583 1726853673.48598: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853673.48601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853673.48610: variable 'omit' from source: magic vars 30583 1726853673.48904: variable 'ansible_distribution_major_version' from source: facts 30583 1726853673.48908: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853673.48911: variable 'omit' from source: magic vars 30583 1726853673.49176: variable 'omit' from source: magic vars 30583 1726853673.49178: variable 'omit' from source: magic vars 30583 1726853673.49181: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853673.49184: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853673.49186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853673.49188: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853673.49190: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853673.49192: variable 'inventory_hostname' from source: 
host vars for 'managed_node2' 30583 1726853673.49199: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853673.49209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853673.49315: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853673.49336: Set connection var ansible_timeout to 10 30583 1726853673.49345: Set connection var ansible_connection to ssh 30583 1726853673.49356: Set connection var ansible_shell_executable to /bin/sh 30583 1726853673.49363: Set connection var ansible_shell_type to sh 30583 1726853673.49379: Set connection var ansible_pipelining to False 30583 1726853673.49405: variable 'ansible_shell_executable' from source: unknown 30583 1726853673.49414: variable 'ansible_connection' from source: unknown 30583 1726853673.49421: variable 'ansible_module_compression' from source: unknown 30583 1726853673.49436: variable 'ansible_shell_type' from source: unknown 30583 1726853673.49444: variable 'ansible_shell_executable' from source: unknown 30583 1726853673.49450: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853673.49456: variable 'ansible_pipelining' from source: unknown 30583 1726853673.49462: variable 'ansible_timeout' from source: unknown 30583 1726853673.49469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853673.49672: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853673.49690: variable 'omit' from source: magic vars 30583 1726853673.49700: starting attempt loop 30583 1726853673.49762: running the handler 30583 1726853673.49765: _low_level_execute_command(): starting 30583 1726853673.49768: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 
30583 1726853673.50319: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853673.50337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853673.50357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853673.50389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853673.50402: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853673.50482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853673.52233: stdout chunk (state=3): >>>/root <<< 30583 1726853673.52388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853673.52393: stdout chunk (state=3): >>><<< 30583 1726853673.52396: stderr chunk (state=3): >>><<< 30583 1726853673.52506: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853673.52510: _low_level_execute_command(): starting 30583 1726853673.52513: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853673.5241911-31005-23241441244385 `" && echo ansible-tmp-1726853673.5241911-31005-23241441244385="` echo /root/.ansible/tmp/ansible-tmp-1726853673.5241911-31005-23241441244385 `" ) && sleep 0' 30583 1726853673.53080: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853673.53094: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853673.53109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853673.53143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853673.53169: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 
30583 1726853673.53199: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853673.53215: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853673.53243: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853673.53291: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853673.53294: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853673.53379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853673.55580: stdout chunk (state=3): >>>ansible-tmp-1726853673.5241911-31005-23241441244385=/root/.ansible/tmp/ansible-tmp-1726853673.5241911-31005-23241441244385 <<< 30583 1726853673.55805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853673.55808: stdout chunk (state=3): >>><<< 30583 1726853673.55817: stderr chunk (state=3): >>><<< 30583 1726853673.55833: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853673.5241911-31005-23241441244385=/root/.ansible/tmp/ansible-tmp-1726853673.5241911-31005-23241441244385 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853673.56039: variable 'ansible_module_compression' from source: unknown 30583 1726853673.56042: ANSIBALLZ: Using lock for package_facts 30583 1726853673.56045: ANSIBALLZ: Acquiring lock 30583 1726853673.56048: ANSIBALLZ: Lock acquired: 139827451480784 30583 1726853673.56050: ANSIBALLZ: Creating module 30583 1726853673.85117: ANSIBALLZ: Writing module into payload 30583 1726853673.85278: ANSIBALLZ: Writing module 30583 1726853673.85313: ANSIBALLZ: Renaming module 30583 1726853673.85319: ANSIBALLZ: Done creating module 30583 1726853673.85362: variable 'ansible_facts' from source: unknown 30583 1726853673.85585: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853673.5241911-31005-23241441244385/AnsiballZ_package_facts.py 30583 1726853673.85699: Sending initial data 30583 1726853673.85703: Sent initial data (161 bytes) 30583 1726853673.86167: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853673.86188: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853673.86199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853673.86286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853673.88030: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853673.88094: stderr chunk (state=3): >>>debug2: Sending 
SSH2_FXP_REALPATH "." <<< 30583 1726853673.88184: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpz6s7bf2z /root/.ansible/tmp/ansible-tmp-1726853673.5241911-31005-23241441244385/AnsiballZ_package_facts.py <<< 30583 1726853673.88187: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853673.5241911-31005-23241441244385/AnsiballZ_package_facts.py" <<< 30583 1726853673.88249: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpz6s7bf2z" to remote "/root/.ansible/tmp/ansible-tmp-1726853673.5241911-31005-23241441244385/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853673.5241911-31005-23241441244385/AnsiballZ_package_facts.py" <<< 30583 1726853673.89770: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853673.89842: stderr chunk (state=3): >>><<< 30583 1726853673.89845: stdout chunk (state=3): >>><<< 30583 1726853673.89848: done transferring module to remote 30583 1726853673.89850: _low_level_execute_command(): starting 30583 1726853673.89852: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853673.5241911-31005-23241441244385/ /root/.ansible/tmp/ansible-tmp-1726853673.5241911-31005-23241441244385/AnsiballZ_package_facts.py && sleep 0' 30583 1726853673.90240: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853673.90270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853673.90276: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853673.90278: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853673.90280: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853673.90286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853673.90330: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853673.90333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853673.90450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853673.92349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853673.92372: stderr chunk (state=3): >>><<< 30583 1726853673.92375: stdout chunk (state=3): >>><<< 30583 1726853673.92387: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853673.92390: _low_level_execute_command(): starting 30583 1726853673.92394: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853673.5241911-31005-23241441244385/AnsiballZ_package_facts.py && sleep 0' 30583 1726853673.92816: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853673.92820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853673.92822: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853673.92825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853673.92827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 30583 1726853673.92881: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853673.92898: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853673.93018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853674.38626: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", 
"version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": 
"libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 30583 1726853674.38648: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": 
[{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 30583 1726853674.38652: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", 
"version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 30583 1726853674.38668: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": 
"file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 30583 1726853674.38682: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", 
"version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 30583 1726853674.38705: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": 
"0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": 
[{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 30583 1726853674.38728: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", 
"release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 30583 1726853674.38753: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": 
[{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": 
"sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 30583 1726853674.38757: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", 
"version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", 
"version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 30583 1726853674.38786: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": 
"perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 30583 1726853674.38800: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", 
"version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": 
[{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 30583 1726853674.38823: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 30583 1726853674.38836: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": 
"python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", 
"version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 30583 1726853674.38845: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30583 1726853674.40716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853674.40776: stderr chunk (state=3): >>><<< 30583 1726853674.40781: stdout chunk (state=3): >>><<< 30583 1726853674.40955: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853674.42883: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853673.5241911-31005-23241441244385/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853674.42900: _low_level_execute_command(): starting 30583 1726853674.42905: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853673.5241911-31005-23241441244385/ > /dev/null 2>&1 && sleep 0' 30583 1726853674.43359: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853674.43362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853674.43365: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853674.43367: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853674.43369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853674.43412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853674.43415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853674.43421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853674.43496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853674.45577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853674.45580: stdout chunk (state=3): >>><<< 30583 1726853674.45583: stderr chunk (state=3): >>><<< 30583 1726853674.45585: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853674.45587: handler run complete 30583 1726853674.46309: variable 'ansible_facts' from source: unknown 30583 1726853674.46758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853674.48814: variable 'ansible_facts' from source: unknown 30583 1726853674.49235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853674.49973: attempt loop complete, returning result 30583 1726853674.49983: _execute() done 30583 1726853674.49986: dumping result to json 30583 1726853674.50207: done dumping result, returning 30583 1726853674.50216: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-05ea-abc5-00000000026f] 30583 1726853674.50221: sending task result for task 02083763-bbaf-05ea-abc5-00000000026f 30583 1726853674.52390: done sending task result for task 02083763-bbaf-05ea-abc5-00000000026f 30583 1726853674.52393: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853674.52493: no more pending results, returning what we have 30583 1726853674.52496: results queue empty 30583 1726853674.52497: checking for any_errors_fatal 30583 1726853674.52503: done checking for any_errors_fatal 30583 1726853674.52504: checking for max_fail_percentage 30583 1726853674.52505: done checking for max_fail_percentage 30583 1726853674.52506: checking to see if all hosts have failed and the running result is not ok 30583 1726853674.52507: done checking to see if all hosts have failed 30583 1726853674.52507: getting the remaining hosts for this loop 30583 1726853674.52509: done getting the remaining hosts for this loop 30583 1726853674.52512: getting 
the next task for host managed_node2 30583 1726853674.52519: done getting next task for host managed_node2 30583 1726853674.52522: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853674.52527: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853674.52536: getting variables 30583 1726853674.52537: in VariableManager get_vars() 30583 1726853674.52565: Calling all_inventory to load vars for managed_node2 30583 1726853674.52568: Calling groups_inventory to load vars for managed_node2 30583 1726853674.52573: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853674.52582: Calling all_plugins_play to load vars for managed_node2 30583 1726853674.52584: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853674.52587: Calling groups_plugins_play to load vars for managed_node2 30583 1726853674.53917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853674.55344: done with get_vars() 30583 1726853674.55378: done getting variables 30583 1726853674.55439: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:34:34 -0400 (0:00:01.075) 0:00:09.891 ****** 30583 1726853674.55478: entering _queue_task() for managed_node2/debug 30583 1726853674.55799: worker is 1 (out of 1 available) 30583 1726853674.55813: exiting _queue_task() for managed_node2/debug 30583 1726853674.55826: done queuing things up, now waiting for results queue to drain 30583 1726853674.55827: waiting for pending results... 
30583 1726853674.56290: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853674.56295: in run() - task 02083763-bbaf-05ea-abc5-00000000020d 30583 1726853674.56298: variable 'ansible_search_path' from source: unknown 30583 1726853674.56300: variable 'ansible_search_path' from source: unknown 30583 1726853674.56310: calling self._execute() 30583 1726853674.56397: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853674.56412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853674.56431: variable 'omit' from source: magic vars 30583 1726853674.56809: variable 'ansible_distribution_major_version' from source: facts 30583 1726853674.56824: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853674.56835: variable 'omit' from source: magic vars 30583 1726853674.56911: variable 'omit' from source: magic vars 30583 1726853674.57019: variable 'network_provider' from source: set_fact 30583 1726853674.57045: variable 'omit' from source: magic vars 30583 1726853674.57101: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853674.57138: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853674.57164: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853674.57190: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853674.57208: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853674.57243: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853674.57253: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 
1726853674.57266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853674.57395: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853674.57399: Set connection var ansible_timeout to 10 30583 1726853674.57504: Set connection var ansible_connection to ssh 30583 1726853674.57509: Set connection var ansible_shell_executable to /bin/sh 30583 1726853674.57511: Set connection var ansible_shell_type to sh 30583 1726853674.57513: Set connection var ansible_pipelining to False 30583 1726853674.57515: variable 'ansible_shell_executable' from source: unknown 30583 1726853674.57518: variable 'ansible_connection' from source: unknown 30583 1726853674.57520: variable 'ansible_module_compression' from source: unknown 30583 1726853674.57522: variable 'ansible_shell_type' from source: unknown 30583 1726853674.57525: variable 'ansible_shell_executable' from source: unknown 30583 1726853674.57527: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853674.57528: variable 'ansible_pipelining' from source: unknown 30583 1726853674.57530: variable 'ansible_timeout' from source: unknown 30583 1726853674.57532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853674.57666: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853674.57685: variable 'omit' from source: magic vars 30583 1726853674.57695: starting attempt loop 30583 1726853674.57701: running the handler 30583 1726853674.57751: handler run complete 30583 1726853674.57773: attempt loop complete, returning result 30583 1726853674.57780: _execute() done 30583 1726853674.57786: dumping result to json 30583 1726853674.57793: done dumping result, returning 
30583 1726853674.57805: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-05ea-abc5-00000000020d] 30583 1726853674.57814: sending task result for task 02083763-bbaf-05ea-abc5-00000000020d ok: [managed_node2] => {} MSG: Using network provider: nm 30583 1726853674.57995: no more pending results, returning what we have 30583 1726853674.57999: results queue empty 30583 1726853674.58000: checking for any_errors_fatal 30583 1726853674.58010: done checking for any_errors_fatal 30583 1726853674.58011: checking for max_fail_percentage 30583 1726853674.58013: done checking for max_fail_percentage 30583 1726853674.58014: checking to see if all hosts have failed and the running result is not ok 30583 1726853674.58015: done checking to see if all hosts have failed 30583 1726853674.58016: getting the remaining hosts for this loop 30583 1726853674.58018: done getting the remaining hosts for this loop 30583 1726853674.58022: getting the next task for host managed_node2 30583 1726853674.58031: done getting next task for host managed_node2 30583 1726853674.58036: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853674.58041: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853674.58052: getting variables 30583 1726853674.58054: in VariableManager get_vars() 30583 1726853674.58095: Calling all_inventory to load vars for managed_node2 30583 1726853674.58098: Calling groups_inventory to load vars for managed_node2 30583 1726853674.58101: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853674.58110: Calling all_plugins_play to load vars for managed_node2 30583 1726853674.58113: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853674.58116: Calling groups_plugins_play to load vars for managed_node2 30583 1726853674.58783: done sending task result for task 02083763-bbaf-05ea-abc5-00000000020d 30583 1726853674.58787: WORKER PROCESS EXITING 30583 1726853674.60521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853674.63693: done with get_vars() 30583 1726853674.63722: done getting variables 30583 1726853674.63818: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:34:34 -0400 (0:00:00.083) 0:00:09.975 ****** 30583 1726853674.63864: entering _queue_task() for managed_node2/fail 30583 1726853674.63866: Creating lock for fail 30583 1726853674.64551: worker is 1 (out of 1 available) 30583 1726853674.64570: exiting _queue_task() for managed_node2/fail 30583 1726853674.64585: done queuing things up, now waiting for results queue to drain 30583 1726853674.64587: waiting for pending results... 30583 1726853674.64889: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853674.65336: in run() - task 02083763-bbaf-05ea-abc5-00000000020e 30583 1726853674.65354: variable 'ansible_search_path' from source: unknown 30583 1726853674.65365: variable 'ansible_search_path' from source: unknown 30583 1726853674.65516: calling self._execute() 30583 1726853674.65634: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853674.65778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853674.65781: variable 'omit' from source: magic vars 30583 1726853674.66495: variable 'ansible_distribution_major_version' from source: facts 30583 1726853674.66502: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853674.66733: variable 'network_state' from source: role '' defaults 30583 1726853674.66790: Evaluated conditional (network_state != {}): False 30583 1726853674.66798: when evaluation is False, skipping this task 30583 1726853674.66805: _execute() done 30583 1726853674.66811: dumping result to json 30583 1726853674.66823: done dumping result, returning 30583 1726853674.66835: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-05ea-abc5-00000000020e] 30583 1726853674.67078: sending task result for task 02083763-bbaf-05ea-abc5-00000000020e 30583 1726853674.67151: done sending task result for task 02083763-bbaf-05ea-abc5-00000000020e 30583 1726853674.67157: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853674.67230: no more pending results, returning what we have 30583 1726853674.67235: results queue empty 30583 1726853674.67236: checking for any_errors_fatal 30583 1726853674.67243: done checking for any_errors_fatal 30583 1726853674.67244: checking for max_fail_percentage 30583 1726853674.67246: done checking for max_fail_percentage 30583 1726853674.67247: checking to see if all hosts have failed and the running result is not ok 30583 1726853674.67248: done checking to see if all hosts have failed 30583 1726853674.67249: getting the remaining hosts for this loop 30583 1726853674.67251: done getting the remaining hosts for this loop 30583 1726853674.67257: getting the next task for host managed_node2 30583 1726853674.67265: done getting next task for host managed_node2 30583 1726853674.67270: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30583 1726853674.67277: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853674.67293: getting variables 30583 1726853674.67295: in VariableManager get_vars() 30583 1726853674.67333: Calling all_inventory to load vars for managed_node2 30583 1726853674.67336: Calling groups_inventory to load vars for managed_node2 30583 1726853674.67339: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853674.67351: Calling all_plugins_play to load vars for managed_node2 30583 1726853674.67357: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853674.67361: Calling groups_plugins_play to load vars for managed_node2 30583 1726853674.69183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853674.70686: done with get_vars() 30583 1726853674.70714: done getting variables 30583 1726853674.70781: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:34:34 -0400 (0:00:00.069) 0:00:10.045 ****** 30583 1726853674.70816: entering _queue_task() for managed_node2/fail 30583 1726853674.71347: worker is 1 (out of 1 available) 30583 1726853674.71362: exiting _queue_task() for managed_node2/fail 30583 1726853674.71379: done queuing things up, now waiting for results queue to drain 30583 1726853674.71380: waiting for pending results... 30583 1726853674.71915: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30583 1726853674.72144: in run() - task 02083763-bbaf-05ea-abc5-00000000020f 30583 1726853674.72276: variable 'ansible_search_path' from source: unknown 30583 1726853674.72281: variable 'ansible_search_path' from source: unknown 30583 1726853674.72284: calling self._execute() 30583 1726853674.72451: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853674.72669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853674.72674: variable 'omit' from source: magic vars 30583 1726853674.73253: variable 'ansible_distribution_major_version' from source: facts 30583 1726853674.73272: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853674.73565: variable 'network_state' from source: role '' defaults 30583 1726853674.73568: Evaluated conditional (network_state != {}): False 30583 1726853674.73578: when evaluation is False, skipping this task 30583 1726853674.73581: _execute() done 30583 1726853674.73589: dumping result to json 30583 1726853674.73681: done dumping result, returning 30583 1726853674.73695: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [02083763-bbaf-05ea-abc5-00000000020f] 30583 1726853674.73706: sending task result for task 02083763-bbaf-05ea-abc5-00000000020f skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853674.73860: no more pending results, returning what we have 30583 1726853674.73864: results queue empty 30583 1726853674.73865: checking for any_errors_fatal 30583 1726853674.73876: done checking for any_errors_fatal 30583 1726853674.73877: checking for max_fail_percentage 30583 1726853674.73879: done checking for max_fail_percentage 30583 1726853674.73880: checking to see if all hosts have failed and the running result is not ok 30583 1726853674.73880: done checking to see if all hosts have failed 30583 1726853674.73881: getting the remaining hosts for this loop 30583 1726853674.73883: done getting the remaining hosts for this loop 30583 1726853674.73886: getting the next task for host managed_node2 30583 1726853674.73895: done getting next task for host managed_node2 30583 1726853674.73899: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30583 1726853674.73904: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853674.73918: getting variables 30583 1726853674.73920: in VariableManager get_vars() 30583 1726853674.73961: Calling all_inventory to load vars for managed_node2 30583 1726853674.73965: Calling groups_inventory to load vars for managed_node2 30583 1726853674.73967: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853674.74092: Calling all_plugins_play to load vars for managed_node2 30583 1726853674.74098: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853674.74178: Calling groups_plugins_play to load vars for managed_node2 30583 1726853674.74979: done sending task result for task 02083763-bbaf-05ea-abc5-00000000020f 30583 1726853674.74982: WORKER PROCESS EXITING 30583 1726853674.76187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853674.78165: done with get_vars() 30583 1726853674.78192: done getting variables 30583 1726853674.78253: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 
or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:34:34 -0400 (0:00:00.074) 0:00:10.120 ****** 30583 1726853674.78292: entering _queue_task() for managed_node2/fail 30583 1726853674.78712: worker is 1 (out of 1 available) 30583 1726853674.78724: exiting _queue_task() for managed_node2/fail 30583 1726853674.78734: done queuing things up, now waiting for results queue to drain 30583 1726853674.78736: waiting for pending results... 30583 1726853674.78944: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30583 1726853674.79149: in run() - task 02083763-bbaf-05ea-abc5-000000000210 30583 1726853674.79194: variable 'ansible_search_path' from source: unknown 30583 1726853674.79213: variable 'ansible_search_path' from source: unknown 30583 1726853674.79276: calling self._execute() 30583 1726853674.79378: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853674.79389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853674.79401: variable 'omit' from source: magic vars 30583 1726853674.79774: variable 'ansible_distribution_major_version' from source: facts 30583 1726853674.79792: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853674.79966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853674.82433: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853674.82528: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853674.82576: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 
1726853674.82618: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853674.82652: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853674.82741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853674.82877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853674.82881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853674.82883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853674.82886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853674.82974: variable 'ansible_distribution_major_version' from source: facts 30583 1726853674.82996: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30583 1726853674.83125: variable 'ansible_distribution' from source: facts 30583 1726853674.83133: variable '__network_rh_distros' from source: role '' defaults 30583 1726853674.83146: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30583 1726853674.83406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853674.83440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853674.83475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853674.83519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853674.83546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853674.83657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853674.83660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853674.83663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853674.83691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853674.83708: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853674.83751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853674.83788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853674.83814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853674.83885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853674.83910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853674.84247: variable 'network_connections' from source: include params 30583 1726853674.84266: variable 'interface' from source: play vars 30583 1726853674.84338: variable 'interface' from source: play vars 30583 1726853674.84358: variable 'network_state' from source: role '' defaults 30583 1726853674.84415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853674.84569: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853674.84617: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 
1726853674.84654: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853674.84697: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853674.84747: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853674.84782: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853674.84825: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853674.84859: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853674.84909: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30583 1726853674.84977: when evaluation is False, skipping this task 30583 1726853674.84980: _execute() done 30583 1726853674.84982: dumping result to json 30583 1726853674.84984: done dumping result, returning 30583 1726853674.84987: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-05ea-abc5-000000000210] 30583 1726853674.84989: sending task result for task 02083763-bbaf-05ea-abc5-000000000210 30583 1726853674.85176: done sending task 
result for task 02083763-bbaf-05ea-abc5-000000000210 30583 1726853674.85180: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30583 1726853674.85230: no more pending results, returning what we have 30583 1726853674.85233: results queue empty 30583 1726853674.85234: checking for any_errors_fatal 30583 1726853674.85240: done checking for any_errors_fatal 30583 1726853674.85241: checking for max_fail_percentage 30583 1726853674.85244: done checking for max_fail_percentage 30583 1726853674.85244: checking to see if all hosts have failed and the running result is not ok 30583 1726853674.85245: done checking to see if all hosts have failed 30583 1726853674.85246: getting the remaining hosts for this loop 30583 1726853674.85248: done getting the remaining hosts for this loop 30583 1726853674.85252: getting the next task for host managed_node2 30583 1726853674.85268: done getting next task for host managed_node2 30583 1726853674.85274: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30583 1726853674.85280: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853674.85296: getting variables 30583 1726853674.85298: in VariableManager get_vars() 30583 1726853674.85337: Calling all_inventory to load vars for managed_node2 30583 1726853674.85340: Calling groups_inventory to load vars for managed_node2 30583 1726853674.85343: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853674.85354: Calling all_plugins_play to load vars for managed_node2 30583 1726853674.85360: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853674.85364: Calling groups_plugins_play to load vars for managed_node2 30583 1726853674.88227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853674.90241: done with get_vars() 30583 1726853674.90276: done getting variables 30583 1726853674.90377: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:34:34 -0400 (0:00:00.121) 0:00:10.241 ****** 30583 1726853674.90411: entering _queue_task() for managed_node2/dnf 30583 1726853674.90749: worker is 1 (out of 1 available) 30583 1726853674.90764: exiting _queue_task() for managed_node2/dnf 30583 1726853674.90779: done queuing things up, now waiting for results queue to drain 30583 1726853674.90781: waiting for pending results... 30583 1726853674.91091: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30583 1726853674.91198: in run() - task 02083763-bbaf-05ea-abc5-000000000211 30583 1726853674.91217: variable 'ansible_search_path' from source: unknown 30583 1726853674.91225: variable 'ansible_search_path' from source: unknown 30583 1726853674.91266: calling self._execute() 30583 1726853674.91370: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853674.91385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853674.91406: variable 'omit' from source: magic vars 30583 1726853674.91767: variable 'ansible_distribution_major_version' from source: facts 30583 1726853674.91847: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853674.92029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853674.94329: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853674.94426: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853674.94577: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853674.94580: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853674.94582: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853674.94622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853674.94654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853674.94687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853674.94732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853674.94750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853674.94873: variable 'ansible_distribution' from source: facts 30583 1726853674.94884: variable 'ansible_distribution_major_version' from source: facts 30583 1726853674.94904: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30583 1726853674.95022: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853674.95148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853674.95239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853674.95242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853674.95244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853674.95264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853674.95306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853674.95334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853674.95369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853674.95413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853674.95430: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853674.95480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853674.95508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853674.95535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853674.95587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853674.95675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853674.95766: variable 'network_connections' from source: include params 30583 1726853674.95790: variable 'interface' from source: play vars 30583 1726853674.95867: variable 'interface' from source: play vars 30583 1726853674.95943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853674.96127: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853674.96175: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853674.96219: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853674.96261: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853674.96312: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853674.96342: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853674.96436: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853674.96440: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853674.96489: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853674.97127: variable 'network_connections' from source: include params 30583 1726853674.97137: variable 'interface' from source: play vars 30583 1726853674.97209: variable 'interface' from source: play vars 30583 1726853674.97376: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853674.97379: when evaluation is False, skipping this task 30583 1726853674.97381: _execute() done 30583 1726853674.97384: dumping result to json 30583 1726853674.97386: done dumping result, returning 30583 1726853674.97388: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000000211] 30583 
1726853674.97390: sending task result for task 02083763-bbaf-05ea-abc5-000000000211 30583 1726853674.97464: done sending task result for task 02083763-bbaf-05ea-abc5-000000000211 30583 1726853674.97467: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853674.97519: no more pending results, returning what we have 30583 1726853674.97522: results queue empty 30583 1726853674.97523: checking for any_errors_fatal 30583 1726853674.97529: done checking for any_errors_fatal 30583 1726853674.97530: checking for max_fail_percentage 30583 1726853674.97532: done checking for max_fail_percentage 30583 1726853674.97533: checking to see if all hosts have failed and the running result is not ok 30583 1726853674.97534: done checking to see if all hosts have failed 30583 1726853674.97534: getting the remaining hosts for this loop 30583 1726853674.97536: done getting the remaining hosts for this loop 30583 1726853674.97540: getting the next task for host managed_node2 30583 1726853674.97548: done getting next task for host managed_node2 30583 1726853674.97553: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30583 1726853674.97560: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853674.97577: getting variables 30583 1726853674.97579: in VariableManager get_vars() 30583 1726853674.97615: Calling all_inventory to load vars for managed_node2 30583 1726853674.97618: Calling groups_inventory to load vars for managed_node2 30583 1726853674.97621: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853674.97631: Calling all_plugins_play to load vars for managed_node2 30583 1726853674.97634: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853674.97637: Calling groups_plugins_play to load vars for managed_node2 30583 1726853674.99318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853675.01053: done with get_vars() 30583 1726853675.01081: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30583 1726853675.01170: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:34:35 -0400 (0:00:00.107) 0:00:10.349 ****** 30583 1726853675.01215: entering _queue_task() for managed_node2/yum 30583 1726853675.01217: Creating lock for yum 30583 1726853675.01587: worker is 1 (out of 1 available) 30583 1726853675.01600: exiting _queue_task() for managed_node2/yum 30583 1726853675.01613: done queuing things up, now waiting for results queue to drain 30583 1726853675.01614: waiting for pending results... 30583 1726853675.01889: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30583 1726853675.02178: in run() - task 02083763-bbaf-05ea-abc5-000000000212 30583 1726853675.02182: variable 'ansible_search_path' from source: unknown 30583 1726853675.02185: variable 'ansible_search_path' from source: unknown 30583 1726853675.02191: calling self._execute() 30583 1726853675.02194: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853675.02196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853675.02199: variable 'omit' from source: magic vars 30583 1726853675.02467: variable 'ansible_distribution_major_version' from source: facts 30583 1726853675.02484: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853675.02641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853675.04769: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853675.04849: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853675.04897: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853675.04935: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853675.04969: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853675.05045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853675.05078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853675.05102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853675.05138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853675.05153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853675.05251: variable 'ansible_distribution_major_version' from source: facts 30583 1726853675.05277: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30583 1726853675.05286: when evaluation is False, skipping this task 30583 1726853675.05293: _execute() done 30583 1726853675.05299: dumping result to json 30583 1726853675.05307: done dumping result, returning 30583 1726853675.05319: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000000212] 30583 1726853675.05328: sending task result for task 02083763-bbaf-05ea-abc5-000000000212 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30583 1726853675.05477: no more pending results, returning what we have 30583 1726853675.05480: results queue empty 30583 1726853675.05481: checking for any_errors_fatal 30583 1726853675.05487: done checking for any_errors_fatal 30583 1726853675.05488: checking for max_fail_percentage 30583 1726853675.05490: done checking for max_fail_percentage 30583 1726853675.05491: checking to see if all hosts have failed and the running result is not ok 30583 1726853675.05492: done checking to see if all hosts have failed 30583 1726853675.05493: getting the remaining hosts for this loop 30583 1726853675.05494: done getting the remaining hosts for this loop 30583 1726853675.05498: getting the next task for host managed_node2 30583 1726853675.05507: done getting next task for host managed_node2 30583 1726853675.05511: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30583 1726853675.05516: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853675.05531: getting variables 30583 1726853675.05533: in VariableManager get_vars() 30583 1726853675.05579: Calling all_inventory to load vars for managed_node2 30583 1726853675.05582: Calling groups_inventory to load vars for managed_node2 30583 1726853675.05584: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853675.05594: Calling all_plugins_play to load vars for managed_node2 30583 1726853675.05597: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853675.05600: Calling groups_plugins_play to load vars for managed_node2 30583 1726853675.06386: done sending task result for task 02083763-bbaf-05ea-abc5-000000000212 30583 1726853675.06389: WORKER PROCESS EXITING 30583 1726853675.07214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853675.08804: done with get_vars() 30583 1726853675.08826: done getting variables 30583 1726853675.08888: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** 
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:34:35 -0400 (0:00:00.077) 0:00:10.426 ****** 30583 1726853675.08918: entering _queue_task() for managed_node2/fail 30583 1726853675.09218: worker is 1 (out of 1 available) 30583 1726853675.09232: exiting _queue_task() for managed_node2/fail 30583 1726853675.09245: done queuing things up, now waiting for results queue to drain 30583 1726853675.09247: waiting for pending results... 30583 1726853675.09542: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30583 1726853675.09697: in run() - task 02083763-bbaf-05ea-abc5-000000000213 30583 1726853675.09719: variable 'ansible_search_path' from source: unknown 30583 1726853675.09727: variable 'ansible_search_path' from source: unknown 30583 1726853675.09774: calling self._execute() 30583 1726853675.09872: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853675.09886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853675.09900: variable 'omit' from source: magic vars 30583 1726853675.10276: variable 'ansible_distribution_major_version' from source: facts 30583 1726853675.10292: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853675.10415: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853675.10619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853675.13091: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853675.13167: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853675.13209: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853675.13267: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853675.13299: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853675.13382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853675.13416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853675.13444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853675.13496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853675.13514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853675.13563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853675.13595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853675.13622: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853675.13665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853675.13688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853675.13729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853675.13757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853675.13789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853675.13832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853675.13849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853675.14022: variable 'network_connections' from source: include params 30583 1726853675.14176: variable 'interface' from source: play vars 30583 1726853675.14179: variable 'interface' from source: play vars 30583 1726853675.14193: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853675.14362: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853675.14408: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853675.14442: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853675.14478: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853675.14528: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853675.14552: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853675.14586: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853675.14619: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853675.14684: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853675.14936: variable 'network_connections' from source: include params 30583 1726853675.14951: variable 'interface' from source: play vars 30583 1726853675.15012: variable 'interface' from source: play vars 30583 1726853675.15044: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853675.15058: when evaluation is False, skipping this task 30583 
1726853675.15066: _execute() done 30583 1726853675.15072: dumping result to json 30583 1726853675.15079: done dumping result, returning 30583 1726853675.15088: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000000213] 30583 1726853675.15164: sending task result for task 02083763-bbaf-05ea-abc5-000000000213 30583 1726853675.15245: done sending task result for task 02083763-bbaf-05ea-abc5-000000000213 30583 1726853675.15247: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853675.15324: no more pending results, returning what we have 30583 1726853675.15328: results queue empty 30583 1726853675.15329: checking for any_errors_fatal 30583 1726853675.15335: done checking for any_errors_fatal 30583 1726853675.15336: checking for max_fail_percentage 30583 1726853675.15339: done checking for max_fail_percentage 30583 1726853675.15340: checking to see if all hosts have failed and the running result is not ok 30583 1726853675.15340: done checking to see if all hosts have failed 30583 1726853675.15341: getting the remaining hosts for this loop 30583 1726853675.15343: done getting the remaining hosts for this loop 30583 1726853675.15348: getting the next task for host managed_node2 30583 1726853675.15359: done getting next task for host managed_node2 30583 1726853675.15363: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30583 1726853675.15369: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853675.15386: getting variables 30583 1726853675.15388: in VariableManager get_vars() 30583 1726853675.15433: Calling all_inventory to load vars for managed_node2 30583 1726853675.15437: Calling groups_inventory to load vars for managed_node2 30583 1726853675.15439: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853675.15450: Calling all_plugins_play to load vars for managed_node2 30583 1726853675.15453: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853675.15459: Calling groups_plugins_play to load vars for managed_node2 30583 1726853675.17180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853675.18699: done with get_vars() 30583 1726853675.18723: done getting variables 30583 1726853675.18785: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:34:35 -0400 (0:00:00.098) 0:00:10.525 ****** 30583 1726853675.18819: entering _queue_task() for managed_node2/package 30583 1726853675.19149: worker is 1 (out of 1 available) 30583 1726853675.19164: exiting _queue_task() for managed_node2/package 30583 1726853675.19378: done queuing things up, now waiting for results queue to drain 30583 1726853675.19380: waiting for pending results... 30583 1726853675.19509: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 30583 1726853675.19596: in run() - task 02083763-bbaf-05ea-abc5-000000000214 30583 1726853675.19777: variable 'ansible_search_path' from source: unknown 30583 1726853675.19780: variable 'ansible_search_path' from source: unknown 30583 1726853675.19782: calling self._execute() 30583 1726853675.19785: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853675.19787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853675.19789: variable 'omit' from source: magic vars 30583 1726853675.20127: variable 'ansible_distribution_major_version' from source: facts 30583 1726853675.20143: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853675.20339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853675.20611: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853675.20665: Loading TestModule 'files' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853675.20704: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853675.20740: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853675.20858: variable 'network_packages' from source: role '' defaults 30583 1726853675.20968: variable '__network_provider_setup' from source: role '' defaults 30583 1726853675.20989: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853675.21059: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853675.21076: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853675.21142: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853675.21302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853675.23344: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853675.23419: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853675.23479: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853675.23519: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853675.23586: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853675.23639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853675.23675: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853675.23711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853675.23753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853675.23803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853675.23829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853675.23859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853675.23891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853675.23977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853675.23980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 
1726853675.24199: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853675.24302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853675.24318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853675.24334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853675.24367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853675.24379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853675.24445: variable 'ansible_python' from source: facts 30583 1726853675.24463: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853675.24520: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853675.24577: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853675.24656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853675.24679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853675.24696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853675.24720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853675.24730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853675.24763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853675.24786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853675.24802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853675.24826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853675.24836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853675.24937: variable 'network_connections' from source: include params 
30583 1726853675.24940: variable 'interface' from source: play vars 30583 1726853675.25016: variable 'interface' from source: play vars 30583 1726853675.25068: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853675.25089: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853675.25114: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853675.25134: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853675.25174: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853675.25350: variable 'network_connections' from source: include params 30583 1726853675.25354: variable 'interface' from source: play vars 30583 1726853675.25426: variable 'interface' from source: play vars 30583 1726853675.25467: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853675.25521: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853675.25718: variable 'network_connections' from source: include params 30583 1726853675.25722: variable 'interface' from source: play vars 30583 1726853675.25769: variable 'interface' from source: play vars 30583 1726853675.25788: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853675.25840: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853675.26123: variable 'network_connections' 
from source: include params 30583 1726853675.26126: variable 'interface' from source: play vars 30583 1726853675.26252: variable 'interface' from source: play vars 30583 1726853675.26255: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853675.26279: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853675.26477: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853675.26480: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853675.26552: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853675.27019: variable 'network_connections' from source: include params 30583 1726853675.27023: variable 'interface' from source: play vars 30583 1726853675.27085: variable 'interface' from source: play vars 30583 1726853675.27099: variable 'ansible_distribution' from source: facts 30583 1726853675.27102: variable '__network_rh_distros' from source: role '' defaults 30583 1726853675.27105: variable 'ansible_distribution_major_version' from source: facts 30583 1726853675.27141: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853675.27267: variable 'ansible_distribution' from source: facts 30583 1726853675.27270: variable '__network_rh_distros' from source: role '' defaults 30583 1726853675.27275: variable 'ansible_distribution_major_version' from source: facts 30583 1726853675.27284: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853675.27396: variable 'ansible_distribution' from source: facts 30583 1726853675.27399: variable '__network_rh_distros' from source: role '' defaults 30583 1726853675.27404: variable 'ansible_distribution_major_version' from source: facts 30583 1726853675.27428: variable 'network_provider' from source: set_fact 30583 
1726853675.27440: variable 'ansible_facts' from source: unknown 30583 1726853675.27873: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30583 1726853675.27877: when evaluation is False, skipping this task 30583 1726853675.27879: _execute() done 30583 1726853675.27882: dumping result to json 30583 1726853675.27884: done dumping result, returning 30583 1726853675.27891: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-05ea-abc5-000000000214] 30583 1726853675.27897: sending task result for task 02083763-bbaf-05ea-abc5-000000000214 30583 1726853675.27982: done sending task result for task 02083763-bbaf-05ea-abc5-000000000214 30583 1726853675.27985: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30583 1726853675.28052: no more pending results, returning what we have 30583 1726853675.28056: results queue empty 30583 1726853675.28057: checking for any_errors_fatal 30583 1726853675.28063: done checking for any_errors_fatal 30583 1726853675.28064: checking for max_fail_percentage 30583 1726853675.28066: done checking for max_fail_percentage 30583 1726853675.28067: checking to see if all hosts have failed and the running result is not ok 30583 1726853675.28067: done checking to see if all hosts have failed 30583 1726853675.28068: getting the remaining hosts for this loop 30583 1726853675.28070: done getting the remaining hosts for this loop 30583 1726853675.28075: getting the next task for host managed_node2 30583 1726853675.28083: done getting next task for host managed_node2 30583 1726853675.28087: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30583 1726853675.28091: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853675.28106: getting variables 30583 1726853675.28107: in VariableManager get_vars() 30583 1726853675.28146: Calling all_inventory to load vars for managed_node2 30583 1726853675.28148: Calling groups_inventory to load vars for managed_node2 30583 1726853675.28150: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853675.28159: Calling all_plugins_play to load vars for managed_node2 30583 1726853675.28161: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853675.28164: Calling groups_plugins_play to load vars for managed_node2 30583 1726853675.28989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853675.30566: done with get_vars() 30583 1726853675.30591: done getting variables 30583 1726853675.30645: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:34:35 -0400 (0:00:00.118) 0:00:10.644 ****** 30583 1726853675.30685: entering _queue_task() for managed_node2/package 30583 1726853675.30926: worker is 1 (out of 1 available) 30583 1726853675.30941: exiting _queue_task() for managed_node2/package 30583 1726853675.30953: done queuing things up, now waiting for results queue to drain 30583 1726853675.30957: waiting for pending results... 
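Earlier in this run, the "Install packages" task was skipped because the conditional `not network_packages is subset(ansible_facts.packages.keys())` evaluated to False, i.e. every required package was already present. Ansible's `subset` test is ordinary set containment; a minimal sketch of that logic in plain Python, with made-up package names that are not taken from this run:

```python
# Illustrates the set logic behind the Jinja2 "subset" test that gated the
# "Install packages" task. All package names here are hypothetical.
network_packages = {"NetworkManager"}            # what the role wants installed
installed = {"NetworkManager", "openssh-server"} # ansible_facts.packages.keys()

# The task only runs when at least one wanted package is missing:
needs_install = not network_packages.issubset(installed)
print(needs_install)  # False -> conditional is False, task is skipped
```

This matches the `"false_condition": "not network_packages is subset(ansible_facts.packages.keys())"` reported in the skip result above.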
30583 1726853675.31123: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30583 1726853675.31207: in run() - task 02083763-bbaf-05ea-abc5-000000000215 30583 1726853675.31218: variable 'ansible_search_path' from source: unknown 30583 1726853675.31221: variable 'ansible_search_path' from source: unknown 30583 1726853675.31250: calling self._execute() 30583 1726853675.31318: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853675.31322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853675.31331: variable 'omit' from source: magic vars 30583 1726853675.31582: variable 'ansible_distribution_major_version' from source: facts 30583 1726853675.31591: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853675.31673: variable 'network_state' from source: role '' defaults 30583 1726853675.31681: Evaluated conditional (network_state != {}): False 30583 1726853675.31684: when evaluation is False, skipping this task 30583 1726853675.31687: _execute() done 30583 1726853675.31690: dumping result to json 30583 1726853675.31692: done dumping result, returning 30583 1726853675.31701: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-05ea-abc5-000000000215] 30583 1726853675.31704: sending task result for task 02083763-bbaf-05ea-abc5-000000000215 30583 1726853675.31791: done sending task result for task 02083763-bbaf-05ea-abc5-000000000215 30583 1726853675.31793: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853675.31879: no more pending results, returning what we have 30583 1726853675.31883: results queue empty 30583 1726853675.31883: checking 
for any_errors_fatal 30583 1726853675.31887: done checking for any_errors_fatal 30583 1726853675.31888: checking for max_fail_percentage 30583 1726853675.31889: done checking for max_fail_percentage 30583 1726853675.31890: checking to see if all hosts have failed and the running result is not ok 30583 1726853675.31891: done checking to see if all hosts have failed 30583 1726853675.31892: getting the remaining hosts for this loop 30583 1726853675.31893: done getting the remaining hosts for this loop 30583 1726853675.31896: getting the next task for host managed_node2 30583 1726853675.31903: done getting next task for host managed_node2 30583 1726853675.31906: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30583 1726853675.31910: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853675.31923: getting variables 30583 1726853675.31925: in VariableManager get_vars() 30583 1726853675.31952: Calling all_inventory to load vars for managed_node2 30583 1726853675.31957: Calling groups_inventory to load vars for managed_node2 30583 1726853675.31959: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853675.31967: Calling all_plugins_play to load vars for managed_node2 30583 1726853675.31969: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853675.31974: Calling groups_plugins_play to load vars for managed_node2 30583 1726853675.35785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853675.36883: done with get_vars() 30583 1726853675.36903: done getting variables 30583 1726853675.36948: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:34:35 -0400 (0:00:00.062) 0:00:10.707 ****** 30583 1726853675.36981: entering _queue_task() for managed_node2/package 30583 1726853675.37348: worker is 1 (out of 1 available) 30583 1726853675.37360: exiting _queue_task() for managed_node2/package 30583 1726853675.37373: done queuing things up, now waiting for results queue to drain 30583 1726853675.37374: waiting for pending results... 
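For context: the skip recorded above happens because the role default leaves `network_state` as an empty mapping, so the task's `when:` condition `network_state != {}` evaluates to False and the executor short-circuits before any module runs. A hypothetical sketch of a task that would produce exactly this skip pattern (the condition is taken from the log; the module arguments are an assumption, not the role's actual source):

```yaml
# Hypothetical sketch, assuming the role-default pattern seen in the log
# (network_state defaults to {} unless the caller sets it).
- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:
    name: python3-libnmstate
    state: present
  # Evaluates to False when the default {} is unchanged, so the task is
  # skipped with skip_reason "Conditional result was False".
  when: network_state != {}
```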
30583 1726853675.37531: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30583 1726853675.37631: in run() - task 02083763-bbaf-05ea-abc5-000000000216 30583 1726853675.37642: variable 'ansible_search_path' from source: unknown 30583 1726853675.37645: variable 'ansible_search_path' from source: unknown 30583 1726853675.37680: calling self._execute() 30583 1726853675.37747: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853675.37751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853675.37763: variable 'omit' from source: magic vars 30583 1726853675.38033: variable 'ansible_distribution_major_version' from source: facts 30583 1726853675.38040: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853675.38122: variable 'network_state' from source: role '' defaults 30583 1726853675.38130: Evaluated conditional (network_state != {}): False 30583 1726853675.38133: when evaluation is False, skipping this task 30583 1726853675.38138: _execute() done 30583 1726853675.38144: dumping result to json 30583 1726853675.38146: done dumping result, returning 30583 1726853675.38149: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-05ea-abc5-000000000216] 30583 1726853675.38152: sending task result for task 02083763-bbaf-05ea-abc5-000000000216 30583 1726853675.38244: done sending task result for task 02083763-bbaf-05ea-abc5-000000000216 30583 1726853675.38248: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853675.38297: no more pending results, returning what we have 30583 1726853675.38301: results queue empty 30583 1726853675.38302: checking for 
any_errors_fatal 30583 1726853675.38310: done checking for any_errors_fatal 30583 1726853675.38310: checking for max_fail_percentage 30583 1726853675.38312: done checking for max_fail_percentage 30583 1726853675.38313: checking to see if all hosts have failed and the running result is not ok 30583 1726853675.38313: done checking to see if all hosts have failed 30583 1726853675.38314: getting the remaining hosts for this loop 30583 1726853675.38316: done getting the remaining hosts for this loop 30583 1726853675.38319: getting the next task for host managed_node2 30583 1726853675.38325: done getting next task for host managed_node2 30583 1726853675.38329: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30583 1726853675.38334: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853675.38349: getting variables 30583 1726853675.38351: in VariableManager get_vars() 30583 1726853675.38390: Calling all_inventory to load vars for managed_node2 30583 1726853675.38392: Calling groups_inventory to load vars for managed_node2 30583 1726853675.38395: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853675.38402: Calling all_plugins_play to load vars for managed_node2 30583 1726853675.38404: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853675.38407: Calling groups_plugins_play to load vars for managed_node2 30583 1726853675.39159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853675.40118: done with get_vars() 30583 1726853675.40132: done getting variables 30583 1726853675.40204: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:34:35 -0400 (0:00:00.032) 0:00:10.739 ****** 30583 1726853675.40226: entering _queue_task() for managed_node2/service 30583 1726853675.40228: Creating lock for service 30583 1726853675.40438: worker is 1 (out of 1 available) 30583 1726853675.40452: exiting _queue_task() for managed_node2/service 30583 1726853675.40467: done queuing things up, now waiting for results queue to drain 30583 1726853675.40468: waiting for pending results... 
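The next skip follows the same pattern with a compound condition: the log evaluates `__network_wireless_connections_defined or __network_team_connections_defined` against the `network_connections` include params and finds neither a wireless nor a team connection, so the restart is skipped. A hypothetical sketch of such a guarded restart task (condition from the log; the `service` arguments are an assumption):

```yaml
# Hypothetical sketch: restart NetworkManager only when wireless or team
# connection types appear in network_connections.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  # Both role-default booleans are False here, so the whole expression is
  # False and the task is skipped, as shown in the result JSON above.
  when: __network_wireless_connections_defined or __network_team_connections_defined
```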
30583 1726853675.40634: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30583 1726853675.40719: in run() - task 02083763-bbaf-05ea-abc5-000000000217 30583 1726853675.40729: variable 'ansible_search_path' from source: unknown 30583 1726853675.40733: variable 'ansible_search_path' from source: unknown 30583 1726853675.40762: calling self._execute() 30583 1726853675.40829: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853675.40834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853675.40841: variable 'omit' from source: magic vars 30583 1726853675.41105: variable 'ansible_distribution_major_version' from source: facts 30583 1726853675.41113: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853675.41196: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853675.41321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853675.42768: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853675.42826: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853675.42852: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853675.42888: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853675.42909: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853675.42965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30583 1726853675.43099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853675.43103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853675.43106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853675.43109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853675.43111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853675.43113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853675.43115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853675.43128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853675.43138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853675.43167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853675.43185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853675.43206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853675.43228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853675.43238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853675.43349: variable 'network_connections' from source: include params 30583 1726853675.43362: variable 'interface' from source: play vars 30583 1726853675.43411: variable 'interface' from source: play vars 30583 1726853675.43463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853675.43581: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853675.43608: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853675.43629: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853675.43653: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853675.43687: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853675.43702: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853675.43719: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853675.43736: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853675.43785: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853675.43938: variable 'network_connections' from source: include params 30583 1726853675.43941: variable 'interface' from source: play vars 30583 1726853675.43990: variable 'interface' from source: play vars 30583 1726853675.44014: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853675.44017: when evaluation is False, skipping this task 30583 1726853675.44020: _execute() done 30583 1726853675.44022: dumping result to json 30583 1726853675.44024: done dumping result, returning 30583 1726853675.44031: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000000217] 30583 1726853675.44036: sending task result for task 02083763-bbaf-05ea-abc5-000000000217 30583 1726853675.44117: done sending task result for task 
02083763-bbaf-05ea-abc5-000000000217 30583 1726853675.44126: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853675.44168: no more pending results, returning what we have 30583 1726853675.44174: results queue empty 30583 1726853675.44175: checking for any_errors_fatal 30583 1726853675.44184: done checking for any_errors_fatal 30583 1726853675.44184: checking for max_fail_percentage 30583 1726853675.44186: done checking for max_fail_percentage 30583 1726853675.44187: checking to see if all hosts have failed and the running result is not ok 30583 1726853675.44188: done checking to see if all hosts have failed 30583 1726853675.44188: getting the remaining hosts for this loop 30583 1726853675.44190: done getting the remaining hosts for this loop 30583 1726853675.44193: getting the next task for host managed_node2 30583 1726853675.44200: done getting next task for host managed_node2 30583 1726853675.44204: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30583 1726853675.44208: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853675.44221: getting variables 30583 1726853675.44223: in VariableManager get_vars() 30583 1726853675.44257: Calling all_inventory to load vars for managed_node2 30583 1726853675.44260: Calling groups_inventory to load vars for managed_node2 30583 1726853675.44262: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853675.44273: Calling all_plugins_play to load vars for managed_node2 30583 1726853675.44275: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853675.44278: Calling groups_plugins_play to load vars for managed_node2 30583 1726853675.45084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853675.45956: done with get_vars() 30583 1726853675.45974: done getting variables 30583 1726853675.46017: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:34:35 -0400 (0:00:00.058) 0:00:10.797 ****** 30583 1726853675.46039: entering _queue_task() for managed_node2/service 30583 1726853675.46268: worker is 1 (out of 1 available) 30583 1726853675.46282: exiting _queue_task() for managed_node2/service 30583 1726853675.46294: done 
queuing things up, now waiting for results queue to drain 30583 1726853675.46295: waiting for pending results... 30583 1726853675.46469: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30583 1726853675.46554: in run() - task 02083763-bbaf-05ea-abc5-000000000218 30583 1726853675.46565: variable 'ansible_search_path' from source: unknown 30583 1726853675.46568: variable 'ansible_search_path' from source: unknown 30583 1726853675.46599: calling self._execute() 30583 1726853675.46668: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853675.46673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853675.46682: variable 'omit' from source: magic vars 30583 1726853675.46938: variable 'ansible_distribution_major_version' from source: facts 30583 1726853675.46947: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853675.47059: variable 'network_provider' from source: set_fact 30583 1726853675.47063: variable 'network_state' from source: role '' defaults 30583 1726853675.47070: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30583 1726853675.47078: variable 'omit' from source: magic vars 30583 1726853675.47119: variable 'omit' from source: magic vars 30583 1726853675.47138: variable 'network_service_name' from source: role '' defaults 30583 1726853675.47192: variable 'network_service_name' from source: role '' defaults 30583 1726853675.47262: variable '__network_provider_setup' from source: role '' defaults 30583 1726853675.47265: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853675.47314: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853675.47321: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853675.47366: variable '__network_packages_default_nm' from source: role '' 
defaults 30583 1726853675.47513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853675.48918: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853675.49219: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853675.49247: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853675.49275: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853675.49296: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853675.49351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853675.49377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853675.49395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853675.49420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853675.49431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853675.49465: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853675.49485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853675.49501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853675.49525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853675.49535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853675.49678: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853675.49751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853675.49768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853675.49787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853675.49815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853675.49825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853675.49888: variable 'ansible_python' from source: facts 30583 1726853675.49901: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853675.49959: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853675.50012: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853675.50096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853675.50113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853675.50132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853675.50159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853675.50167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853675.50201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853675.50220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853675.50240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853675.50264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853675.50276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853675.50363: variable 'network_connections' from source: include params 30583 1726853675.50369: variable 'interface' from source: play vars 30583 1726853675.50421: variable 'interface' from source: play vars 30583 1726853675.50496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853675.50613: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853675.50661: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853675.50694: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853675.50722: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853675.50764: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853675.50790: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853675.50812: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853675.50834: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853675.50873: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853675.51044: variable 'network_connections' from source: include params 30583 1726853675.51048: variable 'interface' from source: play vars 30583 1726853675.51104: variable 'interface' from source: play vars 30583 1726853675.51138: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853675.51194: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853675.51380: variable 'network_connections' from source: include params 30583 1726853675.51383: variable 'interface' from source: play vars 30583 1726853675.51436: variable 'interface' from source: play vars 30583 1726853675.51453: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853675.51507: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853675.51693: variable 'network_connections' from source: include params 30583 1726853675.51696: variable 'interface' from source: play vars 30583 1726853675.51745: variable 'interface' from source: play vars 30583 1726853675.51790: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30583 1726853675.51830: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853675.51836: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853675.51883: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853675.52012: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853675.52313: variable 'network_connections' from source: include params 30583 1726853675.52317: variable 'interface' from source: play vars 30583 1726853675.52360: variable 'interface' from source: play vars 30583 1726853675.52365: variable 'ansible_distribution' from source: facts 30583 1726853675.52368: variable '__network_rh_distros' from source: role '' defaults 30583 1726853675.52376: variable 'ansible_distribution_major_version' from source: facts 30583 1726853675.52402: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853675.52511: variable 'ansible_distribution' from source: facts 30583 1726853675.52515: variable '__network_rh_distros' from source: role '' defaults 30583 1726853675.52518: variable 'ansible_distribution_major_version' from source: facts 30583 1726853675.52525: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853675.52633: variable 'ansible_distribution' from source: facts 30583 1726853675.52642: variable '__network_rh_distros' from source: role '' defaults 30583 1726853675.52647: variable 'ansible_distribution_major_version' from source: facts 30583 1726853675.52674: variable 'network_provider' from source: set_fact 30583 1726853675.52691: variable 'omit' from source: magic vars 30583 1726853675.52713: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853675.52737: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853675.52751: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853675.52764: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853675.52776: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853675.52796: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853675.52799: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853675.52801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853675.52870: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853675.52877: Set connection var ansible_timeout to 10 30583 1726853675.52879: Set connection var ansible_connection to ssh 30583 1726853675.52884: Set connection var ansible_shell_executable to /bin/sh 30583 1726853675.52886: Set connection var ansible_shell_type to sh 30583 1726853675.52894: Set connection var ansible_pipelining to False 30583 1726853675.52913: variable 'ansible_shell_executable' from source: unknown 30583 1726853675.52916: variable 'ansible_connection' from source: unknown 30583 1726853675.52919: variable 'ansible_module_compression' from source: unknown 30583 1726853675.52922: variable 'ansible_shell_type' from source: unknown 30583 1726853675.52925: variable 'ansible_shell_executable' from source: unknown 30583 1726853675.52927: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853675.52929: variable 'ansible_pipelining' from source: unknown 30583 1726853675.52931: variable 'ansible_timeout' from source: unknown 30583 1726853675.52936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 
1726853675.53007: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853675.53015: variable 'omit' from source: magic vars 30583 1726853675.53021: starting attempt loop 30583 1726853675.53023: running the handler 30583 1726853675.53082: variable 'ansible_facts' from source: unknown 30583 1726853675.53480: _low_level_execute_command(): starting 30583 1726853675.53487: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853675.53992: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853675.53995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853675.53998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853675.54000: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853675.54003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853675.54054: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' <<< 30583 1726853675.54057: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853675.54059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853675.54143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853675.55890: stdout chunk (state=3): >>>/root <<< 30583 1726853675.55985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853675.56016: stderr chunk (state=3): >>><<< 30583 1726853675.56019: stdout chunk (state=3): >>><<< 30583 1726853675.56037: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853675.56046: _low_level_execute_command(): starting 30583 1726853675.56053: _low_level_execute_command(): executing: /bin/sh 
-c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853675.5603666-31085-256473766758713 `" && echo ansible-tmp-1726853675.5603666-31085-256473766758713="` echo /root/.ansible/tmp/ansible-tmp-1726853675.5603666-31085-256473766758713 `" ) && sleep 0' 30583 1726853675.56547: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853675.56550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853675.56552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853675.56554: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853675.56556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853675.56607: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853675.56621: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853675.56692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853675.58752: stdout chunk (state=3): 
>>>ansible-tmp-1726853675.5603666-31085-256473766758713=/root/.ansible/tmp/ansible-tmp-1726853675.5603666-31085-256473766758713 <<< 30583 1726853675.58894: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853675.58916: stderr chunk (state=3): >>><<< 30583 1726853675.59043: stdout chunk (state=3): >>><<< 30583 1726853675.59047: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853675.5603666-31085-256473766758713=/root/.ansible/tmp/ansible-tmp-1726853675.5603666-31085-256473766758713 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853675.59049: variable 'ansible_module_compression' from source: unknown 30583 1726853675.59111: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 30583 1726853675.59115: ANSIBALLZ: Acquiring lock 30583 1726853675.59117: ANSIBALLZ: Lock acquired: 139827455545936 30583 
1726853675.59119: ANSIBALLZ: Creating module 30583 1726853675.88978: ANSIBALLZ: Writing module into payload 30583 1726853675.89050: ANSIBALLZ: Writing module 30583 1726853675.89088: ANSIBALLZ: Renaming module 30583 1726853675.89099: ANSIBALLZ: Done creating module 30583 1726853675.89141: variable 'ansible_facts' from source: unknown 30583 1726853675.89374: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853675.5603666-31085-256473766758713/AnsiballZ_systemd.py 30583 1726853675.89608: Sending initial data 30583 1726853675.89621: Sent initial data (156 bytes) 30583 1726853675.90141: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853675.90184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853675.90259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853675.90282: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853675.90297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853675.90410: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 30583 1726853675.92116: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853675.92209: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853675.92290: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpquz9aogq /root/.ansible/tmp/ansible-tmp-1726853675.5603666-31085-256473766758713/AnsiballZ_systemd.py <<< 30583 1726853675.92301: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853675.5603666-31085-256473766758713/AnsiballZ_systemd.py" <<< 30583 1726853675.92351: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpquz9aogq" to remote "/root/.ansible/tmp/ansible-tmp-1726853675.5603666-31085-256473766758713/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853675.5603666-31085-256473766758713/AnsiballZ_systemd.py" <<< 30583 1726853675.93988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853675.94000: stdout chunk (state=3): >>><<< 30583 1726853675.94012: stderr chunk (state=3): >>><<< 30583 
1726853675.94087: done transferring module to remote 30583 1726853675.94090: _low_level_execute_command(): starting 30583 1726853675.94093: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853675.5603666-31085-256473766758713/ /root/.ansible/tmp/ansible-tmp-1726853675.5603666-31085-256473766758713/AnsiballZ_systemd.py && sleep 0' 30583 1726853675.94693: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853675.94709: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853675.94746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853675.94762: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853675.94851: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853675.94876: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853675.94898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853675.94993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853675.96931: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 30583 1726853675.96934: stdout chunk (state=3): >>><<< 30583 1726853675.96941: stderr chunk (state=3): >>><<< 30583 1726853675.97076: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853675.97080: _low_level_execute_command(): starting 30583 1726853675.97083: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853675.5603666-31085-256473766758713/AnsiballZ_systemd.py && sleep 0' 30583 1726853675.97588: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853675.97597: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853675.97607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 
1726853675.97621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853675.97641: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853675.97647: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853675.97685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853675.97750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853675.97765: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853675.97806: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853675.97887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853676.27816: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", 
"WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4562944", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3319463936", "EffectiveMemoryMax": 
"3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1749832000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", 
"LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", 
"ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": 
"active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30583 1726853676.29820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
Shared connection to 10.31.9.197 closed. <<< 30583 1726853676.29845: stderr chunk (state=3): >>><<< 30583 1726853676.29848: stdout chunk (state=3): >>><<< 30583 1726853676.29865: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; 
argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4562944", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3319463936", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1749832000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", 
"MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid 
cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", 
"SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", 
"AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853676.29987: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853675.5603666-31085-256473766758713/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853676.30077: _low_level_execute_command(): starting 30583 1726853676.30081: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853675.5603666-31085-256473766758713/ > /dev/null 2>&1 && sleep 0' 30583 1726853676.30837: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853676.30840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853676.30843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853676.30845: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853676.30847: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853676.30849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853676.30903: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853676.30917: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853676.30995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853676.32952: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853676.32969: stdout chunk (state=3): >>><<< 30583 1726853676.32984: stderr chunk (state=3): >>><<< 30583 1726853676.33003: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853676.33176: handler run complete 30583 1726853676.33180: attempt loop complete, returning result 30583 1726853676.33182: _execute() done 30583 1726853676.33184: dumping result to json 30583 1726853676.33186: done dumping result, returning 30583 1726853676.33188: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-05ea-abc5-000000000218] 30583 1726853676.33190: sending task result for task 02083763-bbaf-05ea-abc5-000000000218 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853676.33694: no more pending results, returning what we have 30583 1726853676.33698: results queue empty 30583 1726853676.33699: checking for any_errors_fatal 30583 1726853676.33705: done checking for any_errors_fatal 30583 1726853676.33706: checking for max_fail_percentage 30583 1726853676.33708: done checking for max_fail_percentage 30583 1726853676.33709: checking to see if all hosts have failed and the running result is not ok 30583 1726853676.33710: done checking to see if all hosts have failed 30583 1726853676.33710: getting the remaining hosts for this loop 30583 1726853676.33712: done getting the remaining hosts for this loop 30583 1726853676.33716: getting the next task for host managed_node2 30583 1726853676.33725: done getting next task for host managed_node2 30583 1726853676.33729: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30583 1726853676.33848: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853676.33861: getting variables 30583 1726853676.33863: in VariableManager get_vars() 30583 1726853676.33899: Calling all_inventory to load vars for managed_node2 30583 1726853676.33902: Calling groups_inventory to load vars for managed_node2 30583 1726853676.33905: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853676.33915: Calling all_plugins_play to load vars for managed_node2 30583 1726853676.33918: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853676.33921: Calling groups_plugins_play to load vars for managed_node2 30583 1726853676.34495: done sending task result for task 02083763-bbaf-05ea-abc5-000000000218 30583 1726853676.34498: WORKER PROCESS EXITING 30583 1726853676.35766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853676.37386: done with get_vars() 30583 1726853676.37407: done getting variables 30583 1726853676.37468: Loading ActionModule 'service' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:34:36 -0400 (0:00:00.914) 0:00:11.712 ****** 30583 1726853676.37506: entering _queue_task() for managed_node2/service 30583 1726853676.37817: worker is 1 (out of 1 available) 30583 1726853676.37830: exiting _queue_task() for managed_node2/service 30583 1726853676.37843: done queuing things up, now waiting for results queue to drain 30583 1726853676.37844: waiting for pending results... 30583 1726853676.38125: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30583 1726853676.38269: in run() - task 02083763-bbaf-05ea-abc5-000000000219 30583 1726853676.38293: variable 'ansible_search_path' from source: unknown 30583 1726853676.38306: variable 'ansible_search_path' from source: unknown 30583 1726853676.38346: calling self._execute() 30583 1726853676.38439: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853676.38475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853676.38478: variable 'omit' from source: magic vars 30583 1726853676.38856: variable 'ansible_distribution_major_version' from source: facts 30583 1726853676.38879: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853676.39091: variable 'network_provider' from source: set_fact 30583 1726853676.39094: Evaluated conditional (network_provider == "nm"): True 30583 1726853676.39158: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 
1726853676.39290: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853676.39470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853676.41861: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853676.41963: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853676.42012: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853676.42051: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853676.42076: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853676.42157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853676.42179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853676.42196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853676.42222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853676.42233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30583 1726853676.42274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853676.42290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853676.42307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853676.42331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853676.42341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853676.42374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853676.42391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853676.42406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853676.42430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853676.42440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853676.42538: variable 'network_connections' from source: include params 30583 1726853676.42548: variable 'interface' from source: play vars 30583 1726853676.42606: variable 'interface' from source: play vars 30583 1726853676.42656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853676.42766: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853676.42797: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853676.42819: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853676.42841: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853676.42875: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853676.42894: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853676.42914: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853676.42929: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853676.42966: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853676.43128: variable 'network_connections' from source: include params 30583 1726853676.43132: variable 'interface' from source: play vars 30583 1726853676.43178: variable 'interface' from source: play vars 30583 1726853676.43206: Evaluated conditional (__network_wpa_supplicant_required): False 30583 1726853676.43210: when evaluation is False, skipping this task 30583 1726853676.43212: _execute() done 30583 1726853676.43216: dumping result to json 30583 1726853676.43218: done dumping result, returning 30583 1726853676.43226: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-05ea-abc5-000000000219] 30583 1726853676.43237: sending task result for task 02083763-bbaf-05ea-abc5-000000000219 30583 1726853676.43318: done sending task result for task 02083763-bbaf-05ea-abc5-000000000219 30583 1726853676.43320: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30583 1726853676.43394: no more pending results, returning what we have 30583 1726853676.43397: results queue empty 30583 1726853676.43398: checking for any_errors_fatal 30583 1726853676.43425: done checking for any_errors_fatal 30583 1726853676.43426: checking for max_fail_percentage 30583 1726853676.43428: done checking for max_fail_percentage 30583 1726853676.43429: checking to see if all hosts have failed and the running result is not ok 30583 1726853676.43429: done checking to see if all hosts have failed 30583 1726853676.43430: getting the remaining hosts for this loop 30583 1726853676.43433: done getting the remaining hosts for this loop 30583 1726853676.43438: getting the next task 
for host managed_node2 30583 1726853676.43445: done getting next task for host managed_node2 30583 1726853676.43448: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30583 1726853676.43453: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853676.43465: getting variables 30583 1726853676.43467: in VariableManager get_vars() 30583 1726853676.43499: Calling all_inventory to load vars for managed_node2 30583 1726853676.43501: Calling groups_inventory to load vars for managed_node2 30583 1726853676.43503: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853676.43511: Calling all_plugins_play to load vars for managed_node2 30583 1726853676.43514: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853676.43516: Calling groups_plugins_play to load vars for managed_node2 30583 1726853676.44766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853676.45734: done with get_vars() 30583 1726853676.45751: done getting variables 30583 1726853676.45795: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:34:36 -0400 (0:00:00.083) 0:00:11.795 ****** 30583 1726853676.45819: entering _queue_task() for managed_node2/service 30583 1726853676.46048: worker is 1 (out of 1 available) 30583 1726853676.46060: exiting _queue_task() for managed_node2/service 30583 1726853676.46074: done queuing things up, now waiting for results queue to drain 30583 1726853676.46076: waiting for pending results... 
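The wpa_supplicant skip recorded above ("Conditional result was False" for `__network_wpa_supplicant_required`) is the standard `when`-guard flow: the conditional is evaluated before the module runs, and a falsy result short-circuits execution. A simplified sketch of that decision (variable name taken from the log; the real evaluation happens inside TaskExecutor and is far more involved):

```python
# Hypothetical reduction of the when-guard behavior seen in the trace above:
# a task whose `when` evaluates False never reaches its module and reports
# a skip result instead.
def run_or_skip(when_result: bool) -> dict:
    if not when_result:
        return {"changed": False, "skip_reason": "Conditional result was False"}
    return {"changed": True}  # module would actually execute here

__network_wpa_supplicant_required = False  # evaluated False in the run above
outcome = run_or_skip(__network_wpa_supplicant_required)
assert outcome["skip_reason"] == "Conditional result was False"
```

The same mechanism produces the "Enable network service" skip that follows, where `network_provider == "initscripts"` evaluates False because the provider is `nm`.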
30583 1726853676.46255: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30583 1726853676.46342: in run() - task 02083763-bbaf-05ea-abc5-00000000021a 30583 1726853676.46353: variable 'ansible_search_path' from source: unknown 30583 1726853676.46356: variable 'ansible_search_path' from source: unknown 30583 1726853676.46389: calling self._execute() 30583 1726853676.46456: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853676.46463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853676.46474: variable 'omit' from source: magic vars 30583 1726853676.46977: variable 'ansible_distribution_major_version' from source: facts 30583 1726853676.46981: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853676.46983: variable 'network_provider' from source: set_fact 30583 1726853676.46985: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853676.46987: when evaluation is False, skipping this task 30583 1726853676.46989: _execute() done 30583 1726853676.46991: dumping result to json 30583 1726853676.46992: done dumping result, returning 30583 1726853676.46994: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-05ea-abc5-00000000021a] 30583 1726853676.46996: sending task result for task 02083763-bbaf-05ea-abc5-00000000021a 30583 1726853676.47077: done sending task result for task 02083763-bbaf-05ea-abc5-00000000021a 30583 1726853676.47085: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853676.47173: no more pending results, returning what we have 30583 1726853676.47178: results queue empty 30583 1726853676.47179: checking for any_errors_fatal 30583 1726853676.47190: done checking for 
any_errors_fatal 30583 1726853676.47190: checking for max_fail_percentage 30583 1726853676.47193: done checking for max_fail_percentage 30583 1726853676.47193: checking to see if all hosts have failed and the running result is not ok 30583 1726853676.47194: done checking to see if all hosts have failed 30583 1726853676.47195: getting the remaining hosts for this loop 30583 1726853676.47197: done getting the remaining hosts for this loop 30583 1726853676.47201: getting the next task for host managed_node2 30583 1726853676.47209: done getting next task for host managed_node2 30583 1726853676.47213: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853676.47219: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853676.47242: getting variables 30583 1726853676.47244: in VariableManager get_vars() 30583 1726853676.47286: Calling all_inventory to load vars for managed_node2 30583 1726853676.47289: Calling groups_inventory to load vars for managed_node2 30583 1726853676.47291: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853676.47304: Calling all_plugins_play to load vars for managed_node2 30583 1726853676.47306: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853676.47308: Calling groups_plugins_play to load vars for managed_node2 30583 1726853676.48469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853676.49321: done with get_vars() 30583 1726853676.49337: done getting variables 30583 1726853676.49382: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:34:36 -0400 (0:00:00.035) 0:00:11.831 ****** 30583 1726853676.49407: entering _queue_task() for managed_node2/copy 30583 1726853676.49629: worker is 1 (out of 1 available) 30583 1726853676.49644: exiting _queue_task() for managed_node2/copy 30583 1726853676.49657: done queuing things up, now waiting for results queue to drain 30583 1726853676.49658: waiting for pending results... 
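The "Enable network service" task above is skipped because the executor evaluates each `when:` clause in order and stops at the first False: the distribution check passes, but `network_provider == "initscripts"` does not (the provider is NetworkManager on this host). A rough sketch of that short-circuit decision, with variable names mirroring the log (this is not the actual TaskExecutor code):

```python
def should_run(conditionals):
    """Mimic the 'Evaluated conditional (...): False -> skip' flow in the log."""
    for expr, result in conditionals:
        # Ansible templates each expression with Jinja2; the pre-computed
        # booleans here stand in for that evaluation step.
        print(f"Evaluated conditional ({expr}): {result}")
        if not result:
            return False  # "when evaluation is False, skipping this task"
    return True

variables = {"ansible_distribution_major_version": "9",  # assumed host fact
             "network_provider": "nm"}                   # from set_fact
conditionals = [
    ("ansible_distribution_major_version != '6'",
     variables["ansible_distribution_major_version"] != "6"),
    ('network_provider == "initscripts"',
     variables["network_provider"] == "initscripts"),
]

print(should_run(conditionals))  # → False: the task is skipped
```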
30583 1726853676.49910: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853676.50040: in run() - task 02083763-bbaf-05ea-abc5-00000000021b 30583 1726853676.50065: variable 'ansible_search_path' from source: unknown 30583 1726853676.50081: variable 'ansible_search_path' from source: unknown 30583 1726853676.50123: calling self._execute() 30583 1726853676.50217: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853676.50234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853676.50250: variable 'omit' from source: magic vars 30583 1726853676.50622: variable 'ansible_distribution_major_version' from source: facts 30583 1726853676.50775: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853676.50779: variable 'network_provider' from source: set_fact 30583 1726853676.50781: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853676.50783: when evaluation is False, skipping this task 30583 1726853676.50785: _execute() done 30583 1726853676.50787: dumping result to json 30583 1726853676.50789: done dumping result, returning 30583 1726853676.50793: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-05ea-abc5-00000000021b] 30583 1726853676.50795: sending task result for task 02083763-bbaf-05ea-abc5-00000000021b 30583 1726853676.50898: done sending task result for task 02083763-bbaf-05ea-abc5-00000000021b 30583 1726853676.50902: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30583 1726853676.50952: no more pending results, returning what we have 30583 1726853676.50956: results queue empty 30583 1726853676.50957: checking for 
any_errors_fatal 30583 1726853676.50967: done checking for any_errors_fatal 30583 1726853676.50968: checking for max_fail_percentage 30583 1726853676.50973: done checking for max_fail_percentage 30583 1726853676.50974: checking to see if all hosts have failed and the running result is not ok 30583 1726853676.50975: done checking to see if all hosts have failed 30583 1726853676.50975: getting the remaining hosts for this loop 30583 1726853676.50977: done getting the remaining hosts for this loop 30583 1726853676.50982: getting the next task for host managed_node2 30583 1726853676.50990: done getting next task for host managed_node2 30583 1726853676.50994: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853676.51001: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853676.51017: getting variables 30583 1726853676.51019: in VariableManager get_vars() 30583 1726853676.51058: Calling all_inventory to load vars for managed_node2 30583 1726853676.51062: Calling groups_inventory to load vars for managed_node2 30583 1726853676.51064: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853676.51280: Calling all_plugins_play to load vars for managed_node2 30583 1726853676.51289: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853676.51293: Calling groups_plugins_play to load vars for managed_node2 30583 1726853676.52408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853676.53263: done with get_vars() 30583 1726853676.53280: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:34:36 -0400 (0:00:00.039) 0:00:11.870 ****** 30583 1726853676.53340: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853676.53341: Creating lock for fedora.linux_system_roles.network_connections 30583 1726853676.53651: worker is 1 (out of 1 available) 30583 1726853676.53668: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853676.53882: done queuing things up, now waiting for results queue to drain 30583 1726853676.53884: waiting for pending results... 
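Note the difference between the two skipped results above: the service task printed only a "censored" placeholder because it set `no_log: true`, while the copy task showed its `false_condition` and `skip_reason`. A minimal sketch of that sanitization (`censor_result` is a hypothetical name, not Ansible's internal function):

```python
CENSORED = ("the output has been hidden due to the fact that 'no_log: true' "
            "was specified for this result")

def censor_result(result, no_log):
    """Keep only 'changed' and a placeholder message when no_log is set."""
    if not no_log:
        return result
    return {"censored": CENSORED, "changed": result.get("changed", False)}

full = {"changed": False,
        "false_condition": 'network_provider == "initscripts"',
        "skip_reason": "Conditional result was False"}

print(censor_result(full, no_log=True))
# → {'censored': "the output has been hidden ...", 'changed': False}
```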
30583 1726853676.54011: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853676.54121: in run() - task 02083763-bbaf-05ea-abc5-00000000021c 30583 1726853676.54142: variable 'ansible_search_path' from source: unknown 30583 1726853676.54218: variable 'ansible_search_path' from source: unknown 30583 1726853676.54222: calling self._execute() 30583 1726853676.54294: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853676.54308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853676.54326: variable 'omit' from source: magic vars 30583 1726853676.54651: variable 'ansible_distribution_major_version' from source: facts 30583 1726853676.54665: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853676.54669: variable 'omit' from source: magic vars 30583 1726853676.54709: variable 'omit' from source: magic vars 30583 1726853676.54818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853676.56244: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853676.56295: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853676.56322: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853676.56347: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853676.56370: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853676.56428: variable 'network_provider' from source: set_fact 30583 1726853676.56524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853676.56543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853676.56563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853676.56590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853676.56601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853676.56656: variable 'omit' from source: magic vars 30583 1726853676.56731: variable 'omit' from source: magic vars 30583 1726853676.56813: variable 'network_connections' from source: include params 30583 1726853676.56822: variable 'interface' from source: play vars 30583 1726853676.56873: variable 'interface' from source: play vars 30583 1726853676.56980: variable 'omit' from source: magic vars 30583 1726853676.56987: variable '__lsr_ansible_managed' from source: task vars 30583 1726853676.57028: variable '__lsr_ansible_managed' from source: task vars 30583 1726853676.57149: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30583 1726853676.57294: Loaded config def from plugin (lookup/template) 30583 1726853676.57298: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30583 1726853676.57318: File lookup term: get_ansible_managed.j2 30583 1726853676.57320: variable 
'ansible_search_path' from source: unknown 30583 1726853676.57329: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30583 1726853676.57340: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30583 1726853676.57353: variable 'ansible_search_path' from source: unknown 30583 1726853676.60667: variable 'ansible_managed' from source: unknown 30583 1726853676.60745: variable 'omit' from source: magic vars 30583 1726853676.60768: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853676.60791: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853676.60805: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853676.60817: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30583 1726853676.60825: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853676.60848: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853676.60851: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853676.60854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853676.60917: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853676.60920: Set connection var ansible_timeout to 10 30583 1726853676.60923: Set connection var ansible_connection to ssh 30583 1726853676.60929: Set connection var ansible_shell_executable to /bin/sh 30583 1726853676.60931: Set connection var ansible_shell_type to sh 30583 1726853676.60943: Set connection var ansible_pipelining to False 30583 1726853676.60962: variable 'ansible_shell_executable' from source: unknown 30583 1726853676.60965: variable 'ansible_connection' from source: unknown 30583 1726853676.60967: variable 'ansible_module_compression' from source: unknown 30583 1726853676.60969: variable 'ansible_shell_type' from source: unknown 30583 1726853676.60973: variable 'ansible_shell_executable' from source: unknown 30583 1726853676.60976: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853676.60978: variable 'ansible_pipelining' from source: unknown 30583 1726853676.60982: variable 'ansible_timeout' from source: unknown 30583 1726853676.60985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853676.61075: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853676.61086: variable 'omit' from 
source: magic vars 30583 1726853676.61089: starting attempt loop 30583 1726853676.61092: running the handler 30583 1726853676.61102: _low_level_execute_command(): starting 30583 1726853676.61108: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853676.61598: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853676.61602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853676.61604: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853676.61606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853676.61661: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853676.61664: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853676.61670: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853676.61748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853676.63512: stdout chunk (state=3): >>>/root <<< 30583 1726853676.63609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 
1726853676.63640: stderr chunk (state=3): >>><<< 30583 1726853676.63644: stdout chunk (state=3): >>><<< 30583 1726853676.63665: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853676.63677: _low_level_execute_command(): starting 30583 1726853676.63684: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853676.6366494-31123-129070676074674 `" && echo ansible-tmp-1726853676.6366494-31123-129070676074674="` echo /root/.ansible/tmp/ansible-tmp-1726853676.6366494-31123-129070676074674 `" ) && sleep 0' 30583 1726853676.64136: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853676.64139: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853676.64142: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853676.64144: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853676.64146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853676.64148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853676.64206: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853676.64210: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853676.64213: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853676.64278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853676.66265: stdout chunk (state=3): >>>ansible-tmp-1726853676.6366494-31123-129070676074674=/root/.ansible/tmp/ansible-tmp-1726853676.6366494-31123-129070676074674 <<< 30583 1726853676.66376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853676.66402: stderr chunk (state=3): >>><<< 30583 1726853676.66405: stdout chunk (state=3): >>><<< 30583 1726853676.66420: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853676.6366494-31123-129070676074674=/root/.ansible/tmp/ansible-tmp-1726853676.6366494-31123-129070676074674 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853676.66456: variable 'ansible_module_compression' from source: unknown 30583 1726853676.66500: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 30583 1726853676.66503: ANSIBALLZ: Acquiring lock 30583 1726853676.66506: ANSIBALLZ: Lock acquired: 139827455420624 30583 1726853676.66508: ANSIBALLZ: Creating module 30583 1726853676.83963: ANSIBALLZ: Writing module into payload 30583 1726853676.84182: ANSIBALLZ: Writing module 30583 1726853676.84207: ANSIBALLZ: Renaming module 30583 1726853676.84213: ANSIBALLZ: Done creating module 30583 1726853676.84252: variable 'ansible_facts' from source: unknown 30583 1726853676.84348: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726853676.6366494-31123-129070676074674/AnsiballZ_network_connections.py 30583 1726853676.84568: Sending initial data 30583 1726853676.84575: Sent initial data (168 bytes) 30583 1726853676.85317: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853676.85321: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853676.85324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853676.85326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853676.85328: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853676.85331: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853676.85333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853676.85335: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853676.85337: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853676.85339: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853676.85341: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853676.85343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853676.85345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853676.85348: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853676.85426: stderr chunk (state=3): >>>debug2: match found <<< 30583 1726853676.85430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 
1726853676.85433: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853676.85435: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853676.85512: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853676.85625: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853676.87327: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853676.87414: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853676.87511: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpcncj5enu /root/.ansible/tmp/ansible-tmp-1726853676.6366494-31123-129070676074674/AnsiballZ_network_connections.py <<< 30583 1726853676.87521: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853676.6366494-31123-129070676074674/AnsiballZ_network_connections.py" <<< 30583 1726853676.87584: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpcncj5enu" to remote "/root/.ansible/tmp/ansible-tmp-1726853676.6366494-31123-129070676074674/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853676.6366494-31123-129070676074674/AnsiballZ_network_connections.py" <<< 30583 1726853676.88898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853676.88902: stdout chunk (state=3): >>><<< 30583 1726853676.88904: stderr chunk (state=3): >>><<< 30583 1726853676.88914: done transferring module to remote 30583 1726853676.88929: _low_level_execute_command(): starting 30583 1726853676.88938: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853676.6366494-31123-129070676074674/ /root/.ansible/tmp/ansible-tmp-1726853676.6366494-31123-129070676074674/AnsiballZ_network_connections.py && sleep 0' 30583 1726853676.89535: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853676.89548: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853676.89584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853676.89601: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853676.89691: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853676.89714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853676.90000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853676.91980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853676.91984: stdout chunk (state=3): >>><<< 30583 1726853676.91986: stderr chunk (state=3): >>><<< 30583 1726853676.91989: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853676.91997: _low_level_execute_command(): starting 30583 1726853676.92008: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853676.6366494-31123-129070676074674/AnsiballZ_network_connections.py && sleep 0' 30583 1726853676.92622: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853676.92640: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853676.92659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853676.92711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853676.92733: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853676.92792: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 
1726853676.92866: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853676.92895: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853676.92937: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853676.93052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853677.21718: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, cd4fb572-41c5-436a-affc-f73b867bbd77\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30583 1726853677.26020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853677.26024: stdout chunk (state=3): >>><<< 30583 1726853677.26027: stderr chunk (state=3): >>><<< 30583 1726853677.26052: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, cd4fb572-41c5-436a-affc-f73b867bbd77\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853677.26196: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853676.6366494-31123-129070676074674/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853677.26200: _low_level_execute_command(): starting 30583 1726853677.26202: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853676.6366494-31123-129070676074674/ > /dev/null 2>&1 && sleep 0' 30583 1726853677.26734: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853677.26784: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853677.26844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853677.26862: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853677.26870: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853677.26984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853677.29033: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853677.29044: stderr chunk (state=3): >>><<< 30583 1726853677.29052: stdout chunk (state=3): >>><<< 30583 1726853677.29077: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853677.29276: handler run complete 30583 1726853677.29280: attempt loop complete, returning result 30583 1726853677.29282: _execute() done 30583 1726853677.29284: dumping result to json 30583 1726853677.29286: done dumping result, returning 30583 1726853677.29288: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-05ea-abc5-00000000021c] 30583 1726853677.29290: sending task result for task 02083763-bbaf-05ea-abc5-00000000021c 30583 1726853677.29366: done sending task result for task 02083763-bbaf-05ea-abc5-00000000021c 30583 1726853677.29370: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, cd4fb572-41c5-436a-affc-f73b867bbd77 30583 1726853677.29479: no more pending results, returning what we have 30583 1726853677.29483: results queue empty 30583 1726853677.29484: checking for any_errors_fatal 30583 1726853677.29492: done checking for any_errors_fatal 30583 1726853677.29492: checking for max_fail_percentage 30583 1726853677.29495: done checking for max_fail_percentage 30583 1726853677.29496: checking to see if all hosts have failed and the 
running result is not ok 30583 1726853677.29496: done checking to see if all hosts have failed 30583 1726853677.29497: getting the remaining hosts for this loop 30583 1726853677.29499: done getting the remaining hosts for this loop 30583 1726853677.29503: getting the next task for host managed_node2 30583 1726853677.29512: done getting next task for host managed_node2 30583 1726853677.29516: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853677.29522: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853677.29534: getting variables 30583 1726853677.29536: in VariableManager get_vars() 30583 1726853677.29692: Calling all_inventory to load vars for managed_node2 30583 1726853677.29695: Calling groups_inventory to load vars for managed_node2 30583 1726853677.29698: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853677.29707: Calling all_plugins_play to load vars for managed_node2 30583 1726853677.29710: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853677.29712: Calling groups_plugins_play to load vars for managed_node2 30583 1726853677.31397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853677.33002: done with get_vars() 30583 1726853677.33028: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:34:37 -0400 (0:00:00.797) 0:00:12.668 ****** 30583 1726853677.33133: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30583 1726853677.33135: Creating lock for fedora.linux_system_roles.network_state 30583 1726853677.33602: worker is 1 (out of 1 available) 30583 1726853677.33616: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30583 1726853677.33627: done queuing things up, now waiting for results queue to drain 30583 1726853677.33629: waiting for pending results... 
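(Editorial annotation, not part of the log.) The "Configure networking connection profiles" result above records the exact `module_args` payload handed to `fedora.linux_system_roles.network_connections`. As an illustrative sketch only, the dict below reconstructs that payload from the log and checks the fields the result reports; the variable names are ours, not the role's.

```python
# Sketch of the module_args payload shown in the task result above.
# Reconstructed verbatim from the log's "_invocation" block; names like
# `connections_args` and `profile` are illustrative, not from the role.
connections_args = {
    "provider": "nm",
    "connections": [
        {
            "name": "statebr",
            "persistent_state": "present",
            "type": "bridge",
            "ip": {"dhcp4": False, "auto6": False},
        }
    ],
    "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
    "ignore_errors": False,
    "force_state_change": False,
    "__debug_flags": "",
}

# The single connection profile the log says was added ("add connection statebr").
profile = connections_args["connections"][0]
```

The `nm` provider plus `persistent_state: present` matches the module stderr line in the result (`'statebr': add connection statebr, …`), i.e. a persistent bridge profile with both DHCPv4 and IPv6 autoconf disabled.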
30583 1726853677.33920: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853677.33988: in run() - task 02083763-bbaf-05ea-abc5-00000000021d 30583 1726853677.34000: variable 'ansible_search_path' from source: unknown 30583 1726853677.34004: variable 'ansible_search_path' from source: unknown 30583 1726853677.34033: calling self._execute() 30583 1726853677.34104: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853677.34107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853677.34124: variable 'omit' from source: magic vars 30583 1726853677.34402: variable 'ansible_distribution_major_version' from source: facts 30583 1726853677.34411: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853677.34494: variable 'network_state' from source: role '' defaults 30583 1726853677.34505: Evaluated conditional (network_state != {}): False 30583 1726853677.34508: when evaluation is False, skipping this task 30583 1726853677.34511: _execute() done 30583 1726853677.34513: dumping result to json 30583 1726853677.34516: done dumping result, returning 30583 1726853677.34526: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-05ea-abc5-00000000021d] 30583 1726853677.34529: sending task result for task 02083763-bbaf-05ea-abc5-00000000021d 30583 1726853677.34607: done sending task result for task 02083763-bbaf-05ea-abc5-00000000021d 30583 1726853677.34609: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853677.34674: no more pending results, returning what we have 30583 1726853677.34678: results queue empty 30583 1726853677.34679: checking for any_errors_fatal 30583 1726853677.34692: done checking for any_errors_fatal 
30583 1726853677.34692: checking for max_fail_percentage 30583 1726853677.34694: done checking for max_fail_percentage 30583 1726853677.34695: checking to see if all hosts have failed and the running result is not ok 30583 1726853677.34696: done checking to see if all hosts have failed 30583 1726853677.34697: getting the remaining hosts for this loop 30583 1726853677.34699: done getting the remaining hosts for this loop 30583 1726853677.34703: getting the next task for host managed_node2 30583 1726853677.34710: done getting next task for host managed_node2 30583 1726853677.34713: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30583 1726853677.34720: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853677.34733: getting variables 30583 1726853677.34734: in VariableManager get_vars() 30583 1726853677.34762: Calling all_inventory to load vars for managed_node2 30583 1726853677.34765: Calling groups_inventory to load vars for managed_node2 30583 1726853677.34767: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853677.34776: Calling all_plugins_play to load vars for managed_node2 30583 1726853677.34778: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853677.34781: Calling groups_plugins_play to load vars for managed_node2 30583 1726853677.35638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853677.36986: done with get_vars() 30583 1726853677.37003: done getting variables 30583 1726853677.37047: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:34:37 -0400 (0:00:00.039) 0:00:12.707 ****** 30583 1726853677.37075: entering _queue_task() for managed_node2/debug 30583 1726853677.37301: worker is 1 (out of 1 available) 30583 1726853677.37315: exiting _queue_task() for managed_node2/debug 30583 1726853677.37327: done queuing things up, now waiting for results queue to drain 30583 1726853677.37328: waiting for pending results... 
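(Editorial annotation, not part of the log.) The "Configure networking state" task above was skipped because its conditional `network_state != {}` evaluated to False: `network_state` came from the role defaults as an empty dict, as the log's `false_condition` and `skip_reason` fields show. A minimal sketch of that evaluation, assuming only what the log states:

```python
# Sketch of the skipped conditional. The log reports:
#   Evaluated conditional (network_state != {}): False
#   "skip_reason": "Conditional result was False"
network_state = {}  # role default, per "variable 'network_state' from source: role '' defaults"

# Ansible runs the task only when the `when` expression is truthy.
run_task = network_state != {}
```

An empty dict compares equal to `{}`, so the expression is False and the task never reaches the executor's attempt loop; contrast this with the preceding `network_connections` task, whose conditional (`ansible_distribution_major_version != '6'`) evaluated True.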
30583 1726853677.37513: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30583 1726853677.37600: in run() - task 02083763-bbaf-05ea-abc5-00000000021e 30583 1726853677.37612: variable 'ansible_search_path' from source: unknown 30583 1726853677.37616: variable 'ansible_search_path' from source: unknown 30583 1726853677.37646: calling self._execute() 30583 1726853677.37714: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853677.37717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853677.37726: variable 'omit' from source: magic vars 30583 1726853677.38001: variable 'ansible_distribution_major_version' from source: facts 30583 1726853677.38011: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853677.38017: variable 'omit' from source: magic vars 30583 1726853677.38057: variable 'omit' from source: magic vars 30583 1726853677.38084: variable 'omit' from source: magic vars 30583 1726853677.38121: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853677.38147: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853677.38165: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853677.38179: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853677.38190: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853677.38215: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853677.38218: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853677.38220: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 30583 1726853677.38292: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853677.38296: Set connection var ansible_timeout to 10 30583 1726853677.38300: Set connection var ansible_connection to ssh 30583 1726853677.38306: Set connection var ansible_shell_executable to /bin/sh 30583 1726853677.38308: Set connection var ansible_shell_type to sh 30583 1726853677.38320: Set connection var ansible_pipelining to False 30583 1726853677.38336: variable 'ansible_shell_executable' from source: unknown 30583 1726853677.38339: variable 'ansible_connection' from source: unknown 30583 1726853677.38341: variable 'ansible_module_compression' from source: unknown 30583 1726853677.38344: variable 'ansible_shell_type' from source: unknown 30583 1726853677.38346: variable 'ansible_shell_executable' from source: unknown 30583 1726853677.38348: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853677.38352: variable 'ansible_pipelining' from source: unknown 30583 1726853677.38354: variable 'ansible_timeout' from source: unknown 30583 1726853677.38361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853677.38460: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853677.38472: variable 'omit' from source: magic vars 30583 1726853677.38478: starting attempt loop 30583 1726853677.38481: running the handler 30583 1726853677.38576: variable '__network_connections_result' from source: set_fact 30583 1726853677.38615: handler run complete 30583 1726853677.38628: attempt loop complete, returning result 30583 1726853677.38631: _execute() done 30583 1726853677.38636: dumping result to json 30583 1726853677.38639: 
done dumping result, returning 30583 1726853677.38649: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-05ea-abc5-00000000021e] 30583 1726853677.38651: sending task result for task 02083763-bbaf-05ea-abc5-00000000021e 30583 1726853677.38760: done sending task result for task 02083763-bbaf-05ea-abc5-00000000021e 30583 1726853677.38763: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, cd4fb572-41c5-436a-affc-f73b867bbd77" ] } 30583 1726853677.38850: no more pending results, returning what we have 30583 1726853677.38855: results queue empty 30583 1726853677.38856: checking for any_errors_fatal 30583 1726853677.38864: done checking for any_errors_fatal 30583 1726853677.38865: checking for max_fail_percentage 30583 1726853677.38866: done checking for max_fail_percentage 30583 1726853677.38867: checking to see if all hosts have failed and the running result is not ok 30583 1726853677.38868: done checking to see if all hosts have failed 30583 1726853677.38869: getting the remaining hosts for this loop 30583 1726853677.38873: done getting the remaining hosts for this loop 30583 1726853677.38877: getting the next task for host managed_node2 30583 1726853677.38883: done getting next task for host managed_node2 30583 1726853677.38887: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30583 1726853677.38912: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853677.38923: getting variables 30583 1726853677.39001: in VariableManager get_vars() 30583 1726853677.39035: Calling all_inventory to load vars for managed_node2 30583 1726853677.39038: Calling groups_inventory to load vars for managed_node2 30583 1726853677.39040: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853677.39048: Calling all_plugins_play to load vars for managed_node2 30583 1726853677.39050: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853677.39052: Calling groups_plugins_play to load vars for managed_node2 30583 1726853677.40391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853677.41265: done with get_vars() 30583 1726853677.41286: done getting variables 30583 1726853677.41328: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:34:37 -0400 (0:00:00.042) 0:00:12.750 ****** 30583 1726853677.41359: entering _queue_task() for managed_node2/debug 30583 1726853677.41594: worker is 1 (out of 1 available) 30583 1726853677.41608: exiting _queue_task() for managed_node2/debug 30583 1726853677.41621: done queuing things up, now waiting for results queue to drain 30583 1726853677.41623: waiting for pending results... 30583 1726853677.41802: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30583 1726853677.41918: in run() - task 02083763-bbaf-05ea-abc5-00000000021f 30583 1726853677.41945: variable 'ansible_search_path' from source: unknown 30583 1726853677.41950: variable 'ansible_search_path' from source: unknown 30583 1726853677.41993: calling self._execute() 30583 1726853677.42177: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853677.42180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853677.42182: variable 'omit' from source: magic vars 30583 1726853677.42456: variable 'ansible_distribution_major_version' from source: facts 30583 1726853677.42474: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853677.42486: variable 'omit' from source: magic vars 30583 1726853677.42553: variable 'omit' from source: magic vars 30583 1726853677.42594: variable 'omit' from source: magic vars 30583 1726853677.42643: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853677.42678: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853677.42715: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853677.42719: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853677.42731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853677.42774: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853677.42777: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853677.42780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853677.42938: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853677.42940: Set connection var ansible_timeout to 10 30583 1726853677.42943: Set connection var ansible_connection to ssh 30583 1726853677.42945: Set connection var ansible_shell_executable to /bin/sh 30583 1726853677.42947: Set connection var ansible_shell_type to sh 30583 1726853677.42949: Set connection var ansible_pipelining to False 30583 1726853677.42951: variable 'ansible_shell_executable' from source: unknown 30583 1726853677.42953: variable 'ansible_connection' from source: unknown 30583 1726853677.42981: variable 'ansible_module_compression' from source: unknown 30583 1726853677.42989: variable 'ansible_shell_type' from source: unknown 30583 1726853677.42996: variable 'ansible_shell_executable' from source: unknown 30583 1726853677.43002: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853677.43009: variable 'ansible_pipelining' from source: unknown 30583 1726853677.43039: variable 'ansible_timeout' from source: unknown 30583 1726853677.43042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853677.43195: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853677.43260: variable 'omit' from source: magic vars 30583 1726853677.43263: starting attempt loop 30583 1726853677.43265: running the handler 30583 1726853677.43288: variable '__network_connections_result' from source: set_fact 30583 1726853677.43380: variable '__network_connections_result' from source: set_fact 30583 1726853677.43504: handler run complete 30583 1726853677.43535: attempt loop complete, returning result 30583 1726853677.43560: _execute() done 30583 1726853677.43665: dumping result to json 30583 1726853677.43668: done dumping result, returning 30583 1726853677.43670: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-05ea-abc5-00000000021f] 30583 1726853677.43681: sending task result for task 02083763-bbaf-05ea-abc5-00000000021f ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, cd4fb572-41c5-436a-affc-f73b867bbd77\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, cd4fb572-41c5-436a-affc-f73b867bbd77" ] } } 30583 1726853677.43820: no more pending results, returning what we have 30583 1726853677.43824: results queue empty 30583 1726853677.43825: checking for any_errors_fatal 30583 1726853677.43833: done checking for any_errors_fatal 30583 1726853677.43834: 
checking for max_fail_percentage 30583 1726853677.43836: done checking for max_fail_percentage 30583 1726853677.43837: checking to see if all hosts have failed and the running result is not ok 30583 1726853677.43838: done checking to see if all hosts have failed 30583 1726853677.43838: getting the remaining hosts for this loop 30583 1726853677.43840: done getting the remaining hosts for this loop 30583 1726853677.43844: getting the next task for host managed_node2 30583 1726853677.43851: done getting next task for host managed_node2 30583 1726853677.43854: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30583 1726853677.43858: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853677.43868: getting variables 30583 1726853677.43869: in VariableManager get_vars() 30583 1726853677.43898: Calling all_inventory to load vars for managed_node2 30583 1726853677.43901: Calling groups_inventory to load vars for managed_node2 30583 1726853677.43907: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853677.43915: Calling all_plugins_play to load vars for managed_node2 30583 1726853677.43917: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853677.43920: Calling groups_plugins_play to load vars for managed_node2 30583 1726853677.44488: done sending task result for task 02083763-bbaf-05ea-abc5-00000000021f 30583 1726853677.44492: WORKER PROCESS EXITING 30583 1726853677.44795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853677.45636: done with get_vars() 30583 1726853677.45652: done getting variables 30583 1726853677.45695: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:34:37 -0400 (0:00:00.043) 0:00:12.794 ****** 30583 1726853677.45719: entering _queue_task() for managed_node2/debug 30583 1726853677.46198: worker is 1 (out of 1 available) 30583 1726853677.46208: exiting _queue_task() for managed_node2/debug 30583 1726853677.46217: done queuing things up, now waiting for results queue to drain 30583 1726853677.46218: waiting for pending results... 
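The task result above carries both `stderr` and a derived `stderr_lines` list. As a minimal sketch (assuming a simplified stand-in for Ansible's internal result post-processing, not the real helper), the controller derives the `*_lines` keys by splitting the raw text on newlines:

```python
def annotate_lines(result):
    # Sketch of Ansible-style result post-processing: for each textual
    # output key, add a *_lines variant split on newlines. The real
    # controller-side helper handles more keys and edge cases.
    for key in ("stdout", "stderr"):
        if key in result and key + "_lines" not in result:
            result[key + "_lines"] = result[key].splitlines()
    return result

res = annotate_lines({
    "stderr": "[002] #0, state:None persistent_state:present, "
              "'statebr': add connection statebr\n",
})
```

This matches the shape seen in the log, where `stderr_lines` is the same message without the trailing newline.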
30583 1726853677.46345: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30583 1726853677.46459: in run() - task 02083763-bbaf-05ea-abc5-000000000220 30583 1726853677.46551: variable 'ansible_search_path' from source: unknown 30583 1726853677.46555: variable 'ansible_search_path' from source: unknown 30583 1726853677.46559: calling self._execute() 30583 1726853677.46634: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853677.46647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853677.46664: variable 'omit' from source: magic vars 30583 1726853677.46986: variable 'ansible_distribution_major_version' from source: facts 30583 1726853677.46998: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853677.47099: variable 'network_state' from source: role '' defaults 30583 1726853677.47108: Evaluated conditional (network_state != {}): False 30583 1726853677.47111: when evaluation is False, skipping this task 30583 1726853677.47113: _execute() done 30583 1726853677.47116: dumping result to json 30583 1726853677.47120: done dumping result, returning 30583 1726853677.47128: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-05ea-abc5-000000000220] 30583 1726853677.47133: sending task result for task 02083763-bbaf-05ea-abc5-000000000220 30583 1726853677.47217: done sending task result for task 02083763-bbaf-05ea-abc5-000000000220 30583 1726853677.47220: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 30583 1726853677.47267: no more pending results, returning what we have 30583 1726853677.47273: results queue empty 30583 1726853677.47274: checking for any_errors_fatal 30583 1726853677.47285: done checking for any_errors_fatal 30583 1726853677.47286: checking for 
max_fail_percentage 30583 1726853677.47288: done checking for max_fail_percentage 30583 1726853677.47288: checking to see if all hosts have failed and the running result is not ok 30583 1726853677.47289: done checking to see if all hosts have failed 30583 1726853677.47290: getting the remaining hosts for this loop 30583 1726853677.47292: done getting the remaining hosts for this loop 30583 1726853677.47295: getting the next task for host managed_node2 30583 1726853677.47302: done getting next task for host managed_node2 30583 1726853677.47305: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30583 1726853677.47309: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853677.47322: getting variables 30583 1726853677.47324: in VariableManager get_vars() 30583 1726853677.47360: Calling all_inventory to load vars for managed_node2 30583 1726853677.47362: Calling groups_inventory to load vars for managed_node2 30583 1726853677.47364: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853677.47374: Calling all_plugins_play to load vars for managed_node2 30583 1726853677.47376: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853677.47378: Calling groups_plugins_play to load vars for managed_node2 30583 1726853677.48133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853677.49075: done with get_vars() 30583 1726853677.49089: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:34:37 -0400 (0:00:00.034) 0:00:12.828 ****** 30583 1726853677.49153: entering _queue_task() for managed_node2/ping 30583 1726853677.49154: Creating lock for ping 30583 1726853677.49378: worker is 1 (out of 1 available) 30583 1726853677.49393: exiting _queue_task() for managed_node2/ping 30583 1726853677.49405: done queuing things up, now waiting for results queue to drain 30583 1726853677.49406: waiting for pending results... 
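The preceding task was skipped because the conditional `network_state != {}` evaluated to False against the role's default (an empty dict). A toy illustration of that decision (hypothetical simplification; real Ansible renders `when:` expressions through Jinja2 with the full host variable set, not a bare `eval`):

```python
def evaluate_when(conditional: str, variables: dict) -> bool:
    # Toy stand-in for Ansible's conditional evaluation. Sketch only:
    # never eval untrusted input in real code.
    return bool(eval(conditional, {}, dict(variables)))

# Mirrors the log: Evaluated conditional (network_state != {}): False
host_vars = {"network_state": {}}
skipped = not evaluate_when("network_state != {}", host_vars)
```

With `network_state` left at its empty default, the expression is False and the task is skipped, which is exactly the `skipping: [managed_node2]` result recorded above.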
30583 1726853677.49588: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 30583 1726853677.49671: in run() - task 02083763-bbaf-05ea-abc5-000000000221 30583 1726853677.49683: variable 'ansible_search_path' from source: unknown 30583 1726853677.49687: variable 'ansible_search_path' from source: unknown 30583 1726853677.49715: calling self._execute() 30583 1726853677.49786: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853677.49790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853677.49799: variable 'omit' from source: magic vars 30583 1726853677.50076: variable 'ansible_distribution_major_version' from source: facts 30583 1726853677.50084: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853677.50093: variable 'omit' from source: magic vars 30583 1726853677.50133: variable 'omit' from source: magic vars 30583 1726853677.50155: variable 'omit' from source: magic vars 30583 1726853677.50192: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853677.50218: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853677.50234: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853677.50248: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853677.50260: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853677.50290: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853677.50293: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853677.50296: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30583 1726853677.50358: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853677.50365: Set connection var ansible_timeout to 10 30583 1726853677.50368: Set connection var ansible_connection to ssh 30583 1726853677.50374: Set connection var ansible_shell_executable to /bin/sh 30583 1726853677.50377: Set connection var ansible_shell_type to sh 30583 1726853677.50385: Set connection var ansible_pipelining to False 30583 1726853677.50406: variable 'ansible_shell_executable' from source: unknown 30583 1726853677.50409: variable 'ansible_connection' from source: unknown 30583 1726853677.50411: variable 'ansible_module_compression' from source: unknown 30583 1726853677.50414: variable 'ansible_shell_type' from source: unknown 30583 1726853677.50416: variable 'ansible_shell_executable' from source: unknown 30583 1726853677.50418: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853677.50420: variable 'ansible_pipelining' from source: unknown 30583 1726853677.50424: variable 'ansible_timeout' from source: unknown 30583 1726853677.50427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853677.50576: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853677.50584: variable 'omit' from source: magic vars 30583 1726853677.50590: starting attempt loop 30583 1726853677.50592: running the handler 30583 1726853677.50603: _low_level_execute_command(): starting 30583 1726853677.50611: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853677.51131: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853677.51135: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853677.51138: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853677.51140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853677.51200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853677.51208: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853677.51211: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853677.51282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853677.53036: stdout chunk (state=3): >>>/root <<< 30583 1726853677.53131: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853677.53167: stderr chunk (state=3): >>><<< 30583 1726853677.53170: stdout chunk (state=3): >>><<< 30583 1726853677.53193: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 
10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853677.53204: _low_level_execute_command(): starting 30583 1726853677.53211: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853677.5319295-31166-251968547173964 `" && echo ansible-tmp-1726853677.5319295-31166-251968547173964="` echo /root/.ansible/tmp/ansible-tmp-1726853677.5319295-31166-251968547173964 `" ) && sleep 0' 30583 1726853677.53642: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853677.53650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853677.53677: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853677.53689: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 
1726853677.53692: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853677.53710: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853677.53744: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853677.53747: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853677.53749: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853677.53830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853677.55872: stdout chunk (state=3): >>>ansible-tmp-1726853677.5319295-31166-251968547173964=/root/.ansible/tmp/ansible-tmp-1726853677.5319295-31166-251968547173964 <<< 30583 1726853677.55980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853677.56008: stderr chunk (state=3): >>><<< 30583 1726853677.56011: stdout chunk (state=3): >>><<< 30583 1726853677.56026: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853677.5319295-31166-251968547173964=/root/.ansible/tmp/ansible-tmp-1726853677.5319295-31166-251968547173964 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853677.56070: variable 'ansible_module_compression' from source: unknown 30583 1726853677.56109: ANSIBALLZ: Using lock for ping 30583 1726853677.56112: ANSIBALLZ: Acquiring lock 30583 1726853677.56115: ANSIBALLZ: Lock acquired: 139827452652784 30583 1726853677.56117: ANSIBALLZ: Creating module 30583 1726853677.64178: ANSIBALLZ: Writing module into payload 30583 1726853677.64182: ANSIBALLZ: Writing module 30583 1726853677.64184: ANSIBALLZ: Renaming module 30583 1726853677.64186: ANSIBALLZ: Done creating module 30583 1726853677.64188: variable 'ansible_facts' from source: unknown 30583 1726853677.64209: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853677.5319295-31166-251968547173964/AnsiballZ_ping.py 30583 1726853677.64347: Sending initial data 30583 1726853677.64358: Sent initial data (153 bytes) 30583 1726853677.64788: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853677.64804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 
10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853677.64815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853677.64857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853677.64870: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853677.64958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853677.66666: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853677.66758: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853677.66822: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp_elne7qn /root/.ansible/tmp/ansible-tmp-1726853677.5319295-31166-251968547173964/AnsiballZ_ping.py <<< 30583 1726853677.66842: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853677.5319295-31166-251968547173964/AnsiballZ_ping.py" <<< 30583 1726853677.66906: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp_elne7qn" to remote "/root/.ansible/tmp/ansible-tmp-1726853677.5319295-31166-251968547173964/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853677.5319295-31166-251968547173964/AnsiballZ_ping.py" <<< 30583 1726853677.67733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853677.67779: stderr chunk (state=3): >>><<< 30583 1726853677.67790: stdout chunk (state=3): >>><<< 30583 1726853677.67842: done transferring module to remote 30583 1726853677.67856: _low_level_execute_command(): starting 30583 1726853677.67865: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853677.5319295-31166-251968547173964/ /root/.ansible/tmp/ansible-tmp-1726853677.5319295-31166-251968547173964/AnsiballZ_ping.py && sleep 0' 30583 1726853677.68484: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853677.68503: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853677.68518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853677.68534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853677.68622: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853677.68653: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853677.68670: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853677.68692: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853677.68790: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853677.70754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853677.70828: stdout chunk (state=3): >>><<< 30583 1726853677.70833: stderr chunk (state=3): >>><<< 30583 1726853677.71139: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853677.71142: _low_level_execute_command(): starting 30583 1726853677.71145: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853677.5319295-31166-251968547173964/AnsiballZ_ping.py && sleep 0' 30583 1726853677.71807: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853677.71823: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853677.71887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853677.71940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853677.71960: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853677.71985: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853677.72288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853677.87759: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30583 1726853677.89245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853677.89252: stdout chunk (state=3): >>><<< 30583 1726853677.89255: stderr chunk (state=3): >>><<< 30583 1726853677.89258: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
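The remote `AnsiballZ_ping.py` run above printed `{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}` on stdout. A minimal sketch of a module producing that result shape (a simplified illustration; the real `ansible.builtin.ping` module also validates arguments and supports a `data="crash"` failure path):

```python
import json

def ping_module(module_args: dict) -> dict:
    # Echo back the requested payload, defaulting to "pong", and report
    # the effective module arguments, as Ansible modules do.
    data = module_args.get("data", "pong")
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

# Modules communicate with the controller by printing one JSON object.
print(json.dumps(ping_module({})))
```

The controller parses that single JSON line from stdout, which is why the log shows the result chunk arriving before the temporary directory is removed.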
30583 1726853677.89261: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853677.5319295-31166-251968547173964/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853677.89263: _low_level_execute_command(): starting 30583 1726853677.89266: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853677.5319295-31166-251968547173964/ > /dev/null 2>&1 && sleep 0' 30583 1726853677.90633: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853677.90813: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853677.90884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853677.92926: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853677.92934: stdout chunk (state=3): >>><<< 30583 1726853677.92952: stderr chunk (state=3): >>><<< 30583 1726853677.92969: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853677.93157: handler run complete 30583 1726853677.93161: attempt loop complete, returning result 30583 1726853677.93163: _execute() done 30583 1726853677.93165: dumping result to json 30583 1726853677.93167: done dumping result, returning 30583 1726853677.93170: 
done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-05ea-abc5-000000000221] 30583 1726853677.93175: sending task result for task 02083763-bbaf-05ea-abc5-000000000221 30583 1726853677.93240: done sending task result for task 02083763-bbaf-05ea-abc5-000000000221 30583 1726853677.93244: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 30583 1726853677.93359: no more pending results, returning what we have 30583 1726853677.93362: results queue empty 30583 1726853677.93364: checking for any_errors_fatal 30583 1726853677.93373: done checking for any_errors_fatal 30583 1726853677.93374: checking for max_fail_percentage 30583 1726853677.93377: done checking for max_fail_percentage 30583 1726853677.93377: checking to see if all hosts have failed and the running result is not ok 30583 1726853677.93378: done checking to see if all hosts have failed 30583 1726853677.93379: getting the remaining hosts for this loop 30583 1726853677.93381: done getting the remaining hosts for this loop 30583 1726853677.93385: getting the next task for host managed_node2 30583 1726853677.93396: done getting next task for host managed_node2 30583 1726853677.93398: ^ task is: TASK: meta (role_complete) 30583 1726853677.93405: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853677.93415: getting variables 30583 1726853677.93417: in VariableManager get_vars() 30583 1726853677.93458: Calling all_inventory to load vars for managed_node2 30583 1726853677.93461: Calling groups_inventory to load vars for managed_node2 30583 1726853677.93464: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853677.93773: Calling all_plugins_play to load vars for managed_node2 30583 1726853677.93777: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853677.93781: Calling groups_plugins_play to load vars for managed_node2 30583 1726853677.95867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853677.97393: done with get_vars() 30583 1726853677.97415: done getting variables 30583 1726853677.97593: done queuing things up, now waiting for results queue to drain 30583 1726853677.97595: results queue empty 30583 1726853677.97596: checking for any_errors_fatal 30583 1726853677.97599: done checking for any_errors_fatal 30583 1726853677.97600: checking for max_fail_percentage 30583 1726853677.97601: done checking for max_fail_percentage 30583 1726853677.97602: checking to see if all hosts have failed and the running result is not ok 30583 1726853677.97603: done checking to see if all hosts have failed 30583 1726853677.97604: getting the remaining hosts for this 
loop 30583 1726853677.97605: done getting the remaining hosts for this loop 30583 1726853677.97608: getting the next task for host managed_node2 30583 1726853677.97613: done getting next task for host managed_node2 30583 1726853677.97615: ^ task is: TASK: Show result 30583 1726853677.97618: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853677.97620: getting variables 30583 1726853677.97621: in VariableManager get_vars() 30583 1726853677.97633: Calling all_inventory to load vars for managed_node2 30583 1726853677.97635: Calling groups_inventory to load vars for managed_node2 30583 1726853677.97638: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853677.97644: Calling all_plugins_play to load vars for managed_node2 30583 1726853677.97646: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853677.97649: Calling groups_plugins_play to load vars for managed_node2 30583 1726853677.99896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853678.02331: done with get_vars() 30583 1726853678.02359: done getting variables 30583 1726853678.02405: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Friday 20 September 2024 13:34:38 -0400 (0:00:00.532) 0:00:13.361 ****** 30583 1726853678.02443: entering _queue_task() for managed_node2/debug 30583 1726853678.02858: worker is 1 (out of 1 available) 30583 1726853678.02876: exiting _queue_task() for managed_node2/debug 30583 1726853678.02888: done queuing things up, now waiting for results queue to drain 30583 1726853678.02889: waiting for pending results... 
30583 1726853678.03387: running TaskExecutor() for managed_node2/TASK: Show result 30583 1726853678.03511: in run() - task 02083763-bbaf-05ea-abc5-00000000018f 30583 1726853678.03516: variable 'ansible_search_path' from source: unknown 30583 1726853678.03518: variable 'ansible_search_path' from source: unknown 30583 1726853678.03749: calling self._execute() 30583 1726853678.03868: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853678.03884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853678.03899: variable 'omit' from source: magic vars 30583 1726853678.04679: variable 'ansible_distribution_major_version' from source: facts 30583 1726853678.04748: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853678.04761: variable 'omit' from source: magic vars 30583 1726853678.04812: variable 'omit' from source: magic vars 30583 1726853678.04911: variable 'omit' from source: magic vars 30583 1726853678.05019: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853678.05070: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853678.05098: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853678.05117: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853678.05131: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853678.05169: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853678.05183: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853678.05190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853678.05375: Set 
connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853678.05378: Set connection var ansible_timeout to 10 30583 1726853678.05385: Set connection var ansible_connection to ssh 30583 1726853678.05387: Set connection var ansible_shell_executable to /bin/sh 30583 1726853678.05392: Set connection var ansible_shell_type to sh 30583 1726853678.05394: Set connection var ansible_pipelining to False 30583 1726853678.05396: variable 'ansible_shell_executable' from source: unknown 30583 1726853678.05398: variable 'ansible_connection' from source: unknown 30583 1726853678.05400: variable 'ansible_module_compression' from source: unknown 30583 1726853678.05402: variable 'ansible_shell_type' from source: unknown 30583 1726853678.05404: variable 'ansible_shell_executable' from source: unknown 30583 1726853678.05406: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853678.05408: variable 'ansible_pipelining' from source: unknown 30583 1726853678.05416: variable 'ansible_timeout' from source: unknown 30583 1726853678.05423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853678.05574: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853678.05591: variable 'omit' from source: magic vars 30583 1726853678.05609: starting attempt loop 30583 1726853678.05616: running the handler 30583 1726853678.05668: variable '__network_connections_result' from source: set_fact 30583 1726853678.05775: variable '__network_connections_result' from source: set_fact 30583 1726853678.05886: handler run complete 30583 1726853678.05917: attempt loop complete, returning result 30583 1726853678.05974: _execute() done 30583 1726853678.05981: dumping result to json 30583 
1726853678.05983: done dumping result, returning 30583 1726853678.05986: done running TaskExecutor() for managed_node2/TASK: Show result [02083763-bbaf-05ea-abc5-00000000018f] 30583 1726853678.05988: sending task result for task 02083763-bbaf-05ea-abc5-00000000018f ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, cd4fb572-41c5-436a-affc-f73b867bbd77\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, cd4fb572-41c5-436a-affc-f73b867bbd77" ] } } 30583 1726853678.06199: no more pending results, returning what we have 30583 1726853678.06203: results queue empty 30583 1726853678.06204: checking for any_errors_fatal 30583 1726853678.06206: done checking for any_errors_fatal 30583 1726853678.06207: checking for max_fail_percentage 30583 1726853678.06209: done checking for max_fail_percentage 30583 1726853678.06210: checking to see if all hosts have failed and the running result is not ok 30583 1726853678.06211: done checking to see if all hosts have failed 30583 1726853678.06211: getting the remaining hosts for this loop 30583 1726853678.06213: done getting the remaining hosts for this loop 30583 1726853678.06217: getting the next task for host managed_node2 30583 1726853678.06227: done getting next task for host managed_node2 30583 1726853678.06233: ^ task is: TASK: Asserts 30583 1726853678.06236: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853678.06242: getting variables 30583 1726853678.06244: in VariableManager get_vars() 30583 1726853678.06383: Calling all_inventory to load vars for managed_node2 30583 1726853678.06387: Calling groups_inventory to load vars for managed_node2 30583 1726853678.06391: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853678.06484: Calling all_plugins_play to load vars for managed_node2 30583 1726853678.06491: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853678.06495: Calling groups_plugins_play to load vars for managed_node2 30583 1726853678.07111: done sending task result for task 02083763-bbaf-05ea-abc5-00000000018f 30583 1726853678.07114: WORKER PROCESS EXITING 30583 1726853678.07924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853678.09535: done with get_vars() 30583 1726853678.09563: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 13:34:38 -0400 (0:00:00.072) 0:00:13.433 ****** 30583 1726853678.09675: entering _queue_task() for managed_node2/include_tasks 30583 1726853678.10026: worker is 1 (out of 1 available) 30583 1726853678.10041: exiting _queue_task() for managed_node2/include_tasks 30583 1726853678.10054: done queuing things up, now waiting for results queue to drain 30583 1726853678.10058: waiting for pending results... 
30583 1726853678.10381: running TaskExecutor() for managed_node2/TASK: Asserts 30583 1726853678.10512: in run() - task 02083763-bbaf-05ea-abc5-000000000096 30583 1726853678.10533: variable 'ansible_search_path' from source: unknown 30583 1726853678.10541: variable 'ansible_search_path' from source: unknown 30583 1726853678.10596: variable 'lsr_assert' from source: include params 30583 1726853678.10824: variable 'lsr_assert' from source: include params 30583 1726853678.10901: variable 'omit' from source: magic vars 30583 1726853678.11048: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853678.11064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853678.11078: variable 'omit' from source: magic vars 30583 1726853678.11301: variable 'ansible_distribution_major_version' from source: facts 30583 1726853678.11317: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853678.11326: variable 'item' from source: unknown 30583 1726853678.11395: variable 'item' from source: unknown 30583 1726853678.11429: variable 'item' from source: unknown 30583 1726853678.11496: variable 'item' from source: unknown 30583 1726853678.11782: dumping result to json 30583 1726853678.11785: done dumping result, returning 30583 1726853678.11788: done running TaskExecutor() for managed_node2/TASK: Asserts [02083763-bbaf-05ea-abc5-000000000096] 30583 1726853678.11790: sending task result for task 02083763-bbaf-05ea-abc5-000000000096 30583 1726853678.11835: done sending task result for task 02083763-bbaf-05ea-abc5-000000000096 30583 1726853678.11838: WORKER PROCESS EXITING 30583 1726853678.11908: no more pending results, returning what we have 30583 1726853678.11913: in VariableManager get_vars() 30583 1726853678.11946: Calling all_inventory to load vars for managed_node2 30583 1726853678.11949: Calling groups_inventory to load vars for managed_node2 30583 1726853678.11953: Calling all_plugins_inventory 
to load vars for managed_node2 30583 1726853678.11969: Calling all_plugins_play to load vars for managed_node2 30583 1726853678.11974: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853678.11978: Calling groups_plugins_play to load vars for managed_node2 30583 1726853678.13596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853678.15122: done with get_vars() 30583 1726853678.15139: variable 'ansible_search_path' from source: unknown 30583 1726853678.15140: variable 'ansible_search_path' from source: unknown 30583 1726853678.15179: we have included files to process 30583 1726853678.15180: generating all_blocks data 30583 1726853678.15182: done generating all_blocks data 30583 1726853678.15187: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30583 1726853678.15188: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30583 1726853678.15190: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30583 1726853678.15384: in VariableManager get_vars() 30583 1726853678.15402: done with get_vars() 30583 1726853678.15679: done processing included file 30583 1726853678.15682: iterating over new_blocks loaded from include file 30583 1726853678.15683: in VariableManager get_vars() 30583 1726853678.15698: done with get_vars() 30583 1726853678.15699: filtering new block on tags 30583 1726853678.15751: done filtering new block on tags 30583 1726853678.15754: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 => (item=tasks/assert_profile_present.yml) 
30583 1726853678.15765: extending task lists for all hosts with included blocks 30583 1726853678.16845: done extending task lists 30583 1726853678.16847: done processing included files 30583 1726853678.16847: results queue empty 30583 1726853678.16848: checking for any_errors_fatal 30583 1726853678.16852: done checking for any_errors_fatal 30583 1726853678.16852: checking for max_fail_percentage 30583 1726853678.16853: done checking for max_fail_percentage 30583 1726853678.16857: checking to see if all hosts have failed and the running result is not ok 30583 1726853678.16857: done checking to see if all hosts have failed 30583 1726853678.16858: getting the remaining hosts for this loop 30583 1726853678.16860: done getting the remaining hosts for this loop 30583 1726853678.16862: getting the next task for host managed_node2 30583 1726853678.16866: done getting next task for host managed_node2 30583 1726853678.16868: ^ task is: TASK: Include the task 'get_profile_stat.yml' 30583 1726853678.16873: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853678.16875: getting variables 30583 1726853678.16876: in VariableManager get_vars() 30583 1726853678.16885: Calling all_inventory to load vars for managed_node2 30583 1726853678.16887: Calling groups_inventory to load vars for managed_node2 30583 1726853678.16889: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853678.16896: Calling all_plugins_play to load vars for managed_node2 30583 1726853678.16898: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853678.16901: Calling groups_plugins_play to load vars for managed_node2 30583 1726853678.18068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853678.19698: done with get_vars() 30583 1726853678.19718: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 13:34:38 -0400 (0:00:00.101) 0:00:13.535 ****** 30583 1726853678.19807: entering _queue_task() for managed_node2/include_tasks 30583 1726853678.20186: worker is 1 (out of 1 available) 30583 1726853678.20203: exiting _queue_task() for managed_node2/include_tasks 30583 1726853678.20216: done queuing things up, now waiting for results queue to drain 30583 1726853678.20218: waiting for pending results... 
30583 1726853678.20545: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 30583 1726853678.20627: in run() - task 02083763-bbaf-05ea-abc5-000000000383 30583 1726853678.20751: variable 'ansible_search_path' from source: unknown 30583 1726853678.20758: variable 'ansible_search_path' from source: unknown 30583 1726853678.20762: calling self._execute() 30583 1726853678.20809: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853678.20821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853678.20837: variable 'omit' from source: magic vars 30583 1726853678.21250: variable 'ansible_distribution_major_version' from source: facts 30583 1726853678.21269: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853678.21281: _execute() done 30583 1726853678.21288: dumping result to json 30583 1726853678.21294: done dumping result, returning 30583 1726853678.21306: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-05ea-abc5-000000000383] 30583 1726853678.21317: sending task result for task 02083763-bbaf-05ea-abc5-000000000383 30583 1726853678.21599: no more pending results, returning what we have 30583 1726853678.21604: in VariableManager get_vars() 30583 1726853678.21641: Calling all_inventory to load vars for managed_node2 30583 1726853678.21644: Calling groups_inventory to load vars for managed_node2 30583 1726853678.21647: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853678.21660: Calling all_plugins_play to load vars for managed_node2 30583 1726853678.21663: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853678.21665: Calling groups_plugins_play to load vars for managed_node2 30583 1726853678.22195: done sending task result for task 02083763-bbaf-05ea-abc5-000000000383 30583 1726853678.22198: WORKER PROCESS EXITING 30583 
1726853678.23460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853678.25198: done with get_vars() 30583 1726853678.25218: variable 'ansible_search_path' from source: unknown 30583 1726853678.25219: variable 'ansible_search_path' from source: unknown 30583 1726853678.25229: variable 'item' from source: include params 30583 1726853678.25339: variable 'item' from source: include params 30583 1726853678.25377: we have included files to process 30583 1726853678.25378: generating all_blocks data 30583 1726853678.25384: done generating all_blocks data 30583 1726853678.25386: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30583 1726853678.25387: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30583 1726853678.25389: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30583 1726853678.27591: done processing included file 30583 1726853678.27594: iterating over new_blocks loaded from include file 30583 1726853678.27595: in VariableManager get_vars() 30583 1726853678.27612: done with get_vars() 30583 1726853678.27613: filtering new block on tags 30583 1726853678.27735: done filtering new block on tags 30583 1726853678.27739: in VariableManager get_vars() 30583 1726853678.27752: done with get_vars() 30583 1726853678.27754: filtering new block on tags 30583 1726853678.27965: done filtering new block on tags 30583 1726853678.27968: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 30583 1726853678.27975: extending task lists for all hosts with included blocks 30583 1726853678.28430: done 
extending task lists 30583 1726853678.28431: done processing included files 30583 1726853678.28432: results queue empty 30583 1726853678.28433: checking for any_errors_fatal 30583 1726853678.28436: done checking for any_errors_fatal 30583 1726853678.28437: checking for max_fail_percentage 30583 1726853678.28438: done checking for max_fail_percentage 30583 1726853678.28438: checking to see if all hosts have failed and the running result is not ok 30583 1726853678.28439: done checking to see if all hosts have failed 30583 1726853678.28440: getting the remaining hosts for this loop 30583 1726853678.28442: done getting the remaining hosts for this loop 30583 1726853678.28444: getting the next task for host managed_node2 30583 1726853678.28448: done getting next task for host managed_node2 30583 1726853678.28451: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 30583 1726853678.28454: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30583 1726853678.28458: getting variables 30583 1726853678.28459: in VariableManager get_vars() 30583 1726853678.28468: Calling all_inventory to load vars for managed_node2 30583 1726853678.28470: Calling groups_inventory to load vars for managed_node2 30583 1726853678.28675: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853678.28682: Calling all_plugins_play to load vars for managed_node2 30583 1726853678.28684: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853678.28687: Calling groups_plugins_play to load vars for managed_node2 30583 1726853678.40336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853678.42479: done with get_vars() 30583 1726853678.42505: done getting variables 30583 1726853678.42577: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:34:38 -0400 (0:00:00.228) 0:00:13.763 ****** 30583 1726853678.42619: entering _queue_task() for managed_node2/set_fact 30583 1726853678.43113: worker is 1 (out of 1 available) 30583 1726853678.43126: exiting _queue_task() for managed_node2/set_fact 30583 1726853678.43139: done queuing things up, now waiting for results queue to drain 30583 1726853678.43141: waiting for pending results... 
30583 1726853678.43370: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 30583 1726853678.43528: in run() - task 02083763-bbaf-05ea-abc5-0000000003fe 30583 1726853678.43551: variable 'ansible_search_path' from source: unknown 30583 1726853678.43562: variable 'ansible_search_path' from source: unknown 30583 1726853678.43610: calling self._execute() 30583 1726853678.43709: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853678.43725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853678.43741: variable 'omit' from source: magic vars 30583 1726853678.44134: variable 'ansible_distribution_major_version' from source: facts 30583 1726853678.44154: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853678.44249: variable 'omit' from source: magic vars 30583 1726853678.44252: variable 'omit' from source: magic vars 30583 1726853678.44282: variable 'omit' from source: magic vars 30583 1726853678.44351: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853678.44415: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853678.44439: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853678.44500: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853678.44702: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853678.44706: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853678.44708: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853678.44712: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30583 1726853678.44815: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853678.44837: Set connection var ansible_timeout to 10 30583 1726853678.44861: Set connection var ansible_connection to ssh 30583 1726853678.44879: Set connection var ansible_shell_executable to /bin/sh 30583 1726853678.44887: Set connection var ansible_shell_type to sh 30583 1726853678.44903: Set connection var ansible_pipelining to False 30583 1726853678.44939: variable 'ansible_shell_executable' from source: unknown 30583 1726853678.44947: variable 'ansible_connection' from source: unknown 30583 1726853678.44957: variable 'ansible_module_compression' from source: unknown 30583 1726853678.44965: variable 'ansible_shell_type' from source: unknown 30583 1726853678.45029: variable 'ansible_shell_executable' from source: unknown 30583 1726853678.45033: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853678.45039: variable 'ansible_pipelining' from source: unknown 30583 1726853678.45044: variable 'ansible_timeout' from source: unknown 30583 1726853678.45047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853678.45165: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853678.45184: variable 'omit' from source: magic vars 30583 1726853678.45196: starting attempt loop 30583 1726853678.45202: running the handler 30583 1726853678.45218: handler run complete 30583 1726853678.45234: attempt loop complete, returning result 30583 1726853678.45245: _execute() done 30583 1726853678.45263: dumping result to json 30583 1726853678.45266: done dumping result, returning 30583 1726853678.45357: done running TaskExecutor() for 
managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [02083763-bbaf-05ea-abc5-0000000003fe] 30583 1726853678.45361: sending task result for task 02083763-bbaf-05ea-abc5-0000000003fe 30583 1726853678.45439: done sending task result for task 02083763-bbaf-05ea-abc5-0000000003fe 30583 1726853678.45442: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 30583 1726853678.45506: no more pending results, returning what we have 30583 1726853678.45510: results queue empty 30583 1726853678.45511: checking for any_errors_fatal 30583 1726853678.45512: done checking for any_errors_fatal 30583 1726853678.45513: checking for max_fail_percentage 30583 1726853678.45515: done checking for max_fail_percentage 30583 1726853678.45516: checking to see if all hosts have failed and the running result is not ok 30583 1726853678.45517: done checking to see if all hosts have failed 30583 1726853678.45517: getting the remaining hosts for this loop 30583 1726853678.45520: done getting the remaining hosts for this loop 30583 1726853678.45524: getting the next task for host managed_node2 30583 1726853678.45533: done getting next task for host managed_node2 30583 1726853678.45537: ^ task is: TASK: Stat profile file 30583 1726853678.45542: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853678.45549: getting variables 30583 1726853678.45551: in VariableManager get_vars() 30583 1726853678.45695: Calling all_inventory to load vars for managed_node2 30583 1726853678.45699: Calling groups_inventory to load vars for managed_node2 30583 1726853678.45703: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853678.45714: Calling all_plugins_play to load vars for managed_node2 30583 1726853678.45718: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853678.45721: Calling groups_plugins_play to load vars for managed_node2 30583 1726853678.47301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853678.48991: done with get_vars() 30583 1726853678.49012: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:34:38 -0400 (0:00:00.065) 0:00:13.828 ****** 30583 1726853678.49131: entering _queue_task() for managed_node2/stat 30583 1726853678.49837: worker is 1 (out of 1 available) 30583 1726853678.49850: exiting _queue_task() for managed_node2/stat 30583 1726853678.49868: done queuing things up, now waiting for results queue to drain 30583 1726853678.49869: 
waiting for pending results... 30583 1726853678.50465: running TaskExecutor() for managed_node2/TASK: Stat profile file 30583 1726853678.50742: in run() - task 02083763-bbaf-05ea-abc5-0000000003ff 30583 1726853678.50747: variable 'ansible_search_path' from source: unknown 30583 1726853678.50749: variable 'ansible_search_path' from source: unknown 30583 1726853678.50752: calling self._execute() 30583 1726853678.50831: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853678.50835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853678.50849: variable 'omit' from source: magic vars 30583 1726853678.51712: variable 'ansible_distribution_major_version' from source: facts 30583 1726853678.51716: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853678.51762: variable 'omit' from source: magic vars 30583 1726853678.51776: variable 'omit' from source: magic vars 30583 1726853678.51867: variable 'profile' from source: play vars 30583 1726853678.51870: variable 'interface' from source: play vars 30583 1726853678.51938: variable 'interface' from source: play vars 30583 1726853678.51959: variable 'omit' from source: magic vars 30583 1726853678.52077: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853678.52083: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853678.52086: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853678.52089: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853678.52092: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853678.52111: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 30583 1726853678.52114: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853678.52176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853678.52216: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853678.52224: Set connection var ansible_timeout to 10 30583 1726853678.52226: Set connection var ansible_connection to ssh 30583 1726853678.52231: Set connection var ansible_shell_executable to /bin/sh 30583 1726853678.52234: Set connection var ansible_shell_type to sh 30583 1726853678.52244: Set connection var ansible_pipelining to False 30583 1726853678.52269: variable 'ansible_shell_executable' from source: unknown 30583 1726853678.52274: variable 'ansible_connection' from source: unknown 30583 1726853678.52276: variable 'ansible_module_compression' from source: unknown 30583 1726853678.52279: variable 'ansible_shell_type' from source: unknown 30583 1726853678.52281: variable 'ansible_shell_executable' from source: unknown 30583 1726853678.52284: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853678.52286: variable 'ansible_pipelining' from source: unknown 30583 1726853678.52294: variable 'ansible_timeout' from source: unknown 30583 1726853678.52296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853678.52489: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853678.52494: variable 'omit' from source: magic vars 30583 1726853678.52536: starting attempt loop 30583 1726853678.52539: running the handler 30583 1726853678.52541: _low_level_execute_command(): starting 30583 1726853678.52544: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 
1726853678.53306: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853678.53416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853678.53468: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853678.53652: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853678.55388: stdout chunk (state=3): >>>/root <<< 30583 1726853678.55518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853678.55548: stderr chunk (state=3): >>><<< 30583 1726853678.55577: stdout chunk (state=3): >>><<< 30583 1726853678.55662: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853678.55666: _low_level_execute_command(): starting 30583 1726853678.55670: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853678.5559986-31207-106756333326118 `" && echo ansible-tmp-1726853678.5559986-31207-106756333326118="` echo /root/.ansible/tmp/ansible-tmp-1726853678.5559986-31207-106756333326118 `" ) && sleep 0' 30583 1726853678.56352: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853678.56421: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853678.56504: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853678.56564: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853678.56669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853678.58708: stdout chunk (state=3): >>>ansible-tmp-1726853678.5559986-31207-106756333326118=/root/.ansible/tmp/ansible-tmp-1726853678.5559986-31207-106756333326118 <<< 30583 1726853678.58868: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853678.58874: stdout chunk (state=3): >>><<< 30583 1726853678.58877: stderr chunk (state=3): >>><<< 30583 1726853678.59050: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853678.5559986-31207-106756333326118=/root/.ansible/tmp/ansible-tmp-1726853678.5559986-31207-106756333326118 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853678.59054: variable 'ansible_module_compression' from source: unknown 30583 1726853678.59059: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30583 1726853678.59078: variable 'ansible_facts' from source: unknown 30583 1726853678.59165: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853678.5559986-31207-106756333326118/AnsiballZ_stat.py 30583 1726853678.59418: Sending initial data 30583 1726853678.59422: Sent initial data (153 bytes) 30583 1726853678.59935: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853678.59945: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853678.59952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853678.59970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853678.59981: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853678.59989: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853678.59999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853678.60012: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853678.60094: stderr chunk (state=3): >>>debug2: resolve_canonicalize: 
hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853678.60135: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853678.60203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853678.61886: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853678.61987: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853678.62078: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp5cn6uvdh /root/.ansible/tmp/ansible-tmp-1726853678.5559986-31207-106756333326118/AnsiballZ_stat.py <<< 30583 1726853678.62081: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853678.5559986-31207-106756333326118/AnsiballZ_stat.py" <<< 30583 1726853678.62133: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp5cn6uvdh" to remote "/root/.ansible/tmp/ansible-tmp-1726853678.5559986-31207-106756333326118/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853678.5559986-31207-106756333326118/AnsiballZ_stat.py" <<< 30583 1726853678.63113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853678.63117: stdout chunk (state=3): >>><<< 30583 1726853678.63119: stderr chunk (state=3): >>><<< 30583 1726853678.63121: done transferring module to remote 30583 1726853678.63123: _low_level_execute_command(): starting 30583 1726853678.63126: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853678.5559986-31207-106756333326118/ /root/.ansible/tmp/ansible-tmp-1726853678.5559986-31207-106756333326118/AnsiballZ_stat.py && sleep 0' 30583 1726853678.63668: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853678.63686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853678.63709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853678.63821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853678.63842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853678.63864: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853678.63982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853678.65894: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853678.65904: stdout chunk (state=3): >>><<< 30583 1726853678.65914: stderr chunk (state=3): >>><<< 30583 1726853678.65933: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853678.65940: _low_level_execute_command(): starting 30583 1726853678.65949: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853678.5559986-31207-106756333326118/AnsiballZ_stat.py && sleep 0' 30583 1726853678.66546: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853678.66563: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853678.66581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853678.66599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853678.66614: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853678.66692: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853678.66727: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' <<< 30583 1726853678.66743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853678.66770: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853678.66879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853678.82798: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30583 1726853678.84341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853678.84345: stdout chunk (state=3): >>><<< 30583 1726853678.84348: stderr chunk (state=3): >>><<< 30583 1726853678.84370: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853678.84492: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853678.5559986-31207-106756333326118/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853678.84497: _low_level_execute_command(): starting 30583 1726853678.84499: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853678.5559986-31207-106756333326118/ > /dev/null 2>&1 && sleep 0' 30583 1726853678.85577: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853678.85778: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853678.85781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853678.85784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 
1726853678.85786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853678.85789: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853678.85791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853678.85793: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853678.85795: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853678.85798: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853678.85800: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853678.85802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853678.85804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853678.85814: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853678.85816: stderr chunk (state=3): >>>debug2: match found <<< 30583 1726853678.85818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853678.85877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853678.85902: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853678.86092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853678.86186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853678.88164: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853678.88168: stdout chunk (state=3): >>><<< 30583 1726853678.88175: stderr chunk (state=3): >>><<< 30583 
1726853678.88192: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853678.88218: handler run complete 30583 1726853678.88221: attempt loop complete, returning result 30583 1726853678.88224: _execute() done 30583 1726853678.88226: dumping result to json 30583 1726853678.88228: done dumping result, returning 30583 1726853678.88236: done running TaskExecutor() for managed_node2/TASK: Stat profile file [02083763-bbaf-05ea-abc5-0000000003ff] 30583 1726853678.88328: sending task result for task 02083763-bbaf-05ea-abc5-0000000003ff 30583 1726853678.88399: done sending task result for task 02083763-bbaf-05ea-abc5-0000000003ff 30583 1726853678.88402: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 30583 1726853678.88507: no more pending results, returning what we 
have 30583 1726853678.88511: results queue empty 30583 1726853678.88512: checking for any_errors_fatal 30583 1726853678.88518: done checking for any_errors_fatal 30583 1726853678.88519: checking for max_fail_percentage 30583 1726853678.88521: done checking for max_fail_percentage 30583 1726853678.88522: checking to see if all hosts have failed and the running result is not ok 30583 1726853678.88523: done checking to see if all hosts have failed 30583 1726853678.88523: getting the remaining hosts for this loop 30583 1726853678.88525: done getting the remaining hosts for this loop 30583 1726853678.88530: getting the next task for host managed_node2 30583 1726853678.88542: done getting next task for host managed_node2 30583 1726853678.88545: ^ task is: TASK: Set NM profile exist flag based on the profile files 30583 1726853678.88550: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853678.88558: getting variables 30583 1726853678.88560: in VariableManager get_vars() 30583 1726853678.88594: Calling all_inventory to load vars for managed_node2 30583 1726853678.88597: Calling groups_inventory to load vars for managed_node2 30583 1726853678.88600: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853678.88611: Calling all_plugins_play to load vars for managed_node2 30583 1726853678.88613: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853678.88615: Calling groups_plugins_play to load vars for managed_node2 30583 1726853678.91870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853678.95565: done with get_vars() 30583 1726853678.95810: done getting variables 30583 1726853678.95880: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:34:38 -0400 (0:00:00.467) 0:00:14.296 ****** 30583 1726853678.95976: entering _queue_task() for managed_node2/set_fact 30583 1726853678.96657: worker is 1 (out of 1 available) 30583 1726853678.96670: exiting _queue_task() for managed_node2/set_fact 30583 1726853678.96880: done queuing things up, now waiting for results queue to drain 30583 1726853678.96882: waiting for pending results... 
30583 1726853678.97462: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 30583 1726853678.97580: in run() - task 02083763-bbaf-05ea-abc5-000000000400 30583 1726853678.97777: variable 'ansible_search_path' from source: unknown 30583 1726853678.97781: variable 'ansible_search_path' from source: unknown 30583 1726853678.97784: calling self._execute() 30583 1726853678.97935: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853678.97995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853678.98016: variable 'omit' from source: magic vars 30583 1726853678.98873: variable 'ansible_distribution_major_version' from source: facts 30583 1726853678.98892: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853678.99132: variable 'profile_stat' from source: set_fact 30583 1726853678.99147: Evaluated conditional (profile_stat.stat.exists): False 30583 1726853678.99193: when evaluation is False, skipping this task 30583 1726853678.99200: _execute() done 30583 1726853678.99207: dumping result to json 30583 1726853678.99214: done dumping result, returning 30583 1726853678.99224: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [02083763-bbaf-05ea-abc5-000000000400] 30583 1726853678.99232: sending task result for task 02083763-bbaf-05ea-abc5-000000000400 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30583 1726853678.99450: no more pending results, returning what we have 30583 1726853678.99457: results queue empty 30583 1726853678.99459: checking for any_errors_fatal 30583 1726853678.99470: done checking for any_errors_fatal 30583 1726853678.99473: checking for max_fail_percentage 30583 1726853678.99475: done checking for max_fail_percentage 30583 1726853678.99477: checking to see if all 
hosts have failed and the running result is not ok 30583 1726853678.99477: done checking to see if all hosts have failed 30583 1726853678.99478: getting the remaining hosts for this loop 30583 1726853678.99481: done getting the remaining hosts for this loop 30583 1726853678.99485: getting the next task for host managed_node2 30583 1726853678.99493: done getting next task for host managed_node2 30583 1726853678.99496: ^ task is: TASK: Get NM profile info 30583 1726853678.99503: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853678.99508: getting variables 30583 1726853678.99510: in VariableManager get_vars() 30583 1726853678.99544: Calling all_inventory to load vars for managed_node2 30583 1726853678.99548: Calling groups_inventory to load vars for managed_node2 30583 1726853678.99552: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853678.99567: Calling all_plugins_play to load vars for managed_node2 30583 1726853678.99572: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853679.00008: Calling groups_plugins_play to load vars for managed_node2 30583 1726853679.00660: done sending task result for task 02083763-bbaf-05ea-abc5-000000000400 30583 1726853679.00664: WORKER PROCESS EXITING 30583 1726853679.01960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853679.04076: done with get_vars() 30583 1726853679.04097: done getting variables 30583 1726853679.04203: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:34:39 -0400 (0:00:00.083) 0:00:14.379 ****** 30583 1726853679.04237: entering _queue_task() for managed_node2/shell 30583 1726853679.04282: Creating lock for shell 30583 1726853679.04890: worker is 1 (out of 1 available) 30583 1726853679.04900: exiting _queue_task() for managed_node2/shell 30583 1726853679.04910: done queuing things up, now waiting for results queue to drain 30583 1726853679.04911: waiting for pending results... 
30583 1726853679.05014: running TaskExecutor() for managed_node2/TASK: Get NM profile info 30583 1726853679.05143: in run() - task 02083763-bbaf-05ea-abc5-000000000401 30583 1726853679.05165: variable 'ansible_search_path' from source: unknown 30583 1726853679.05174: variable 'ansible_search_path' from source: unknown 30583 1726853679.05214: calling self._execute() 30583 1726853679.05316: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853679.05327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853679.05340: variable 'omit' from source: magic vars 30583 1726853679.05732: variable 'ansible_distribution_major_version' from source: facts 30583 1726853679.05750: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853679.05765: variable 'omit' from source: magic vars 30583 1726853679.05832: variable 'omit' from source: magic vars 30583 1726853679.05944: variable 'profile' from source: play vars 30583 1726853679.05954: variable 'interface' from source: play vars 30583 1726853679.06032: variable 'interface' from source: play vars 30583 1726853679.06058: variable 'omit' from source: magic vars 30583 1726853679.06115: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853679.06154: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853679.06223: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853679.06227: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853679.06230: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853679.06261: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 
1726853679.06269: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853679.06277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853679.06381: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853679.06443: Set connection var ansible_timeout to 10 30583 1726853679.06447: Set connection var ansible_connection to ssh 30583 1726853679.06449: Set connection var ansible_shell_executable to /bin/sh 30583 1726853679.06451: Set connection var ansible_shell_type to sh 30583 1726853679.06454: Set connection var ansible_pipelining to False 30583 1726853679.06466: variable 'ansible_shell_executable' from source: unknown 30583 1726853679.06475: variable 'ansible_connection' from source: unknown 30583 1726853679.06481: variable 'ansible_module_compression' from source: unknown 30583 1726853679.06487: variable 'ansible_shell_type' from source: unknown 30583 1726853679.06493: variable 'ansible_shell_executable' from source: unknown 30583 1726853679.06501: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853679.06582: variable 'ansible_pipelining' from source: unknown 30583 1726853679.06586: variable 'ansible_timeout' from source: unknown 30583 1726853679.06587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853679.07083: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853679.07087: variable 'omit' from source: magic vars 30583 1726853679.07091: starting attempt loop 30583 1726853679.07093: running the handler 30583 1726853679.07096: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853679.07098: _low_level_execute_command(): starting 30583 1726853679.07100: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853679.08346: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853679.08350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853679.08353: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853679.08359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853679.08710: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853679.10478: stdout chunk (state=3): >>>/root <<< 30583 1726853679.10780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853679.10784: stdout chunk (state=3): >>><<< 30583 1726853679.10786: stderr chunk (state=3): 
>>><<< 30583 1726853679.10790: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853679.10793: _low_level_execute_command(): starting 30583 1726853679.10797: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853679.1069095-31235-138416252426238 `" && echo ansible-tmp-1726853679.1069095-31235-138416252426238="` echo /root/.ansible/tmp/ansible-tmp-1726853679.1069095-31235-138416252426238 `" ) && sleep 0' 30583 1726853679.11888: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853679.11898: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853679.11912: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30583 1726853679.11926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853679.11940: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853679.11946: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853679.11958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853679.11970: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853679.11979: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853679.11986: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853679.11993: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853679.12002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853679.12013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853679.12020: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853679.12026: stderr chunk (state=3): >>>debug2: match found <<< 30583 1726853679.12034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853679.12098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853679.12281: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853679.12327: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853679.12443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853679.14542: stdout chunk (state=3): 
>>>ansible-tmp-1726853679.1069095-31235-138416252426238=/root/.ansible/tmp/ansible-tmp-1726853679.1069095-31235-138416252426238 <<< 30583 1726853679.14701: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853679.14704: stdout chunk (state=3): >>><<< 30583 1726853679.14706: stderr chunk (state=3): >>><<< 30583 1726853679.14725: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853679.1069095-31235-138416252426238=/root/.ansible/tmp/ansible-tmp-1726853679.1069095-31235-138416252426238 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853679.14762: variable 'ansible_module_compression' from source: unknown 30583 1726853679.14876: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30583 1726853679.14879: variable 'ansible_facts' 
from source: unknown 30583 1726853679.14968: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853679.1069095-31235-138416252426238/AnsiballZ_command.py 30583 1726853679.15201: Sending initial data 30583 1726853679.15210: Sent initial data (156 bytes) 30583 1726853679.15795: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853679.15817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853679.15919: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853679.17604: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports 
extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853679.17687: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853679.17782: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpjbpgg6rd /root/.ansible/tmp/ansible-tmp-1726853679.1069095-31235-138416252426238/AnsiballZ_command.py <<< 30583 1726853679.17799: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853679.1069095-31235-138416252426238/AnsiballZ_command.py" <<< 30583 1726853679.17902: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpjbpgg6rd" to remote "/root/.ansible/tmp/ansible-tmp-1726853679.1069095-31235-138416252426238/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853679.1069095-31235-138416252426238/AnsiballZ_command.py" <<< 30583 1726853679.18890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853679.18894: stderr chunk (state=3): >>><<< 30583 1726853679.18896: stdout chunk (state=3): >>><<< 30583 1726853679.18905: done transferring module to remote 30583 1726853679.18927: _low_level_execute_command(): starting 30583 1726853679.18938: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853679.1069095-31235-138416252426238/ /root/.ansible/tmp/ansible-tmp-1726853679.1069095-31235-138416252426238/AnsiballZ_command.py && sleep 0' 30583 1726853679.19602: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853679.19693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853679.19725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853679.19751: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853679.19791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853679.19875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853679.21974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853679.21979: stdout chunk (state=3): >>><<< 30583 1726853679.21982: stderr chunk (state=3): >>><<< 30583 1726853679.21984: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853679.21987: _low_level_execute_command(): starting 30583 1726853679.21989: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853679.1069095-31235-138416252426238/AnsiballZ_command.py && sleep 0' 30583 1726853679.22534: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853679.22550: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853679.22567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853679.22588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853679.22687: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853679.22702: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853679.22723: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853679.22836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853679.40656: stdout chunk (state=3): >>> {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 13:34:39.387611", "end": "2024-09-20 13:34:39.405371", "delta": "0:00:00.017760", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30583 1726853679.42367: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853679.42394: stderr chunk (state=3): >>><<< 30583 1726853679.42397: stdout chunk (state=3): >>><<< 30583 1726853679.42413: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 13:34:39.387611", "end": "2024-09-20 13:34:39.405371", "delta": "0:00:00.017760", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.9.197 closed. 30583 1726853679.42442: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853679.1069095-31235-138416252426238/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853679.42449: _low_level_execute_command(): starting 30583 1726853679.42452: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853679.1069095-31235-138416252426238/ > /dev/null 2>&1 && sleep 0' 30583 1726853679.42909: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853679.42914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853679.42917: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853679.42919: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853679.42922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853679.42975: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853679.42979: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853679.42981: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853679.43049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853679.44959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853679.44986: stderr chunk (state=3): >>><<< 30583 1726853679.44990: stdout chunk (state=3): >>><<< 30583 1726853679.45002: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/429203141d'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
30583 1726853679.45008: handler run complete
30583 1726853679.45026: Evaluated conditional (False): False
30583 1726853679.45036: attempt loop complete, returning result
30583 1726853679.45040: _execute() done
30583 1726853679.45043: dumping result to json
30583 1726853679.45045: done dumping result, returning
30583 1726853679.45053: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [02083763-bbaf-05ea-abc5-000000000401]
30583 1726853679.45058: sending task result for task 02083763-bbaf-05ea-abc5-000000000401
30583 1726853679.45151: done sending task result for task 02083763-bbaf-05ea-abc5-000000000401
30583 1726853679.45157: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false,
    "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc",
    "delta": "0:00:00.017760",
    "end": "2024-09-20 13:34:39.405371",
    "rc": 0,
    "start": "2024-09-20 13:34:39.387611"
}

STDOUT:

statebr /etc/NetworkManager/system-connections/statebr.nmconnection

30583 1726853679.45223: no more pending results, returning what we have
30583 1726853679.45227: results queue empty
30583 1726853679.45228: checking for any_errors_fatal
30583 1726853679.45235: done checking for any_errors_fatal
30583 1726853679.45235: checking for max_fail_percentage
30583 1726853679.45237: done checking for max_fail_percentage
30583 1726853679.45238: checking to see if all hosts have failed and the running result is not ok
30583 1726853679.45239: done checking to see if all hosts have failed
30583 1726853679.45239: getting the remaining hosts for this loop
30583 1726853679.45241: done getting the remaining hosts for this loop
30583 1726853679.45245: getting the next task for host managed_node2
30583 1726853679.45252: done getting next task for host
managed_node2 30583 1726853679.45254: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30583 1726853679.45261: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853679.45265: getting variables 30583 1726853679.45267: in VariableManager get_vars() 30583 1726853679.45301: Calling all_inventory to load vars for managed_node2 30583 1726853679.45303: Calling groups_inventory to load vars for managed_node2 30583 1726853679.45307: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853679.45317: Calling all_plugins_play to load vars for managed_node2 30583 1726853679.45319: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853679.45322: Calling groups_plugins_play to load vars for managed_node2 30583 1726853679.46100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853679.47012: done with get_vars() 30583 1726853679.47027: done getting variables 30583 1726853679.47075: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:34:39 -0400 (0:00:00.428) 0:00:14.808 ****** 30583 1726853679.47100: entering _queue_task() for managed_node2/set_fact 30583 1726853679.47340: worker is 1 (out of 1 available) 30583 1726853679.47357: exiting _queue_task() for managed_node2/set_fact 30583 1726853679.47369: done queuing things up, now waiting for results queue to drain 30583 1726853679.47372: waiting for pending results... 
30583 1726853679.47550: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30583 1726853679.47630: in run() - task 02083763-bbaf-05ea-abc5-000000000402 30583 1726853679.47640: variable 'ansible_search_path' from source: unknown 30583 1726853679.47644: variable 'ansible_search_path' from source: unknown 30583 1726853679.47675: calling self._execute() 30583 1726853679.47748: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853679.47752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853679.47761: variable 'omit' from source: magic vars 30583 1726853679.48038: variable 'ansible_distribution_major_version' from source: facts 30583 1726853679.48048: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853679.48146: variable 'nm_profile_exists' from source: set_fact 30583 1726853679.48152: Evaluated conditional (nm_profile_exists.rc == 0): True 30583 1726853679.48160: variable 'omit' from source: magic vars 30583 1726853679.48195: variable 'omit' from source: magic vars 30583 1726853679.48217: variable 'omit' from source: magic vars 30583 1726853679.48252: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853679.48283: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853679.48300: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853679.48313: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853679.48323: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853679.48347: variable 'inventory_hostname' from source: host vars for 'managed_node2' 
30583 1726853679.48350: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853679.48353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853679.48424: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853679.48428: Set connection var ansible_timeout to 10 30583 1726853679.48430: Set connection var ansible_connection to ssh 30583 1726853679.48436: Set connection var ansible_shell_executable to /bin/sh 30583 1726853679.48438: Set connection var ansible_shell_type to sh 30583 1726853679.48446: Set connection var ansible_pipelining to False 30583 1726853679.48467: variable 'ansible_shell_executable' from source: unknown 30583 1726853679.48470: variable 'ansible_connection' from source: unknown 30583 1726853679.48480: variable 'ansible_module_compression' from source: unknown 30583 1726853679.48483: variable 'ansible_shell_type' from source: unknown 30583 1726853679.48485: variable 'ansible_shell_executable' from source: unknown 30583 1726853679.48488: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853679.48491: variable 'ansible_pipelining' from source: unknown 30583 1726853679.48493: variable 'ansible_timeout' from source: unknown 30583 1726853679.48495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853679.48589: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853679.48599: variable 'omit' from source: magic vars 30583 1726853679.48605: starting attempt loop 30583 1726853679.48608: running the handler 30583 1726853679.48618: handler run complete 30583 1726853679.48627: attempt loop complete, returning result 30583 1726853679.48629: _execute() done 
30583 1726853679.48632: dumping result to json
30583 1726853679.48635: done dumping result, returning
30583 1726853679.48642: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-05ea-abc5-000000000402]
30583 1726853679.48645: sending task result for task 02083763-bbaf-05ea-abc5-000000000402
ok: [managed_node2] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": true,
        "lsr_net_profile_exists": true,
        "lsr_net_profile_fingerprint": true
    },
    "changed": false
}
30583 1726853679.48781: no more pending results, returning what we have
30583 1726853679.48784: results queue empty
30583 1726853679.48785: checking for any_errors_fatal
30583 1726853679.48794: done checking for any_errors_fatal
30583 1726853679.48794: checking for max_fail_percentage
30583 1726853679.48796: done checking for max_fail_percentage
30583 1726853679.48797: checking to see if all hosts have failed and the running result is not ok
30583 1726853679.48799: done checking to see if all hosts have failed
30583 1726853679.48800: getting the remaining hosts for this loop
30583 1726853679.48802: done getting the remaining hosts for this loop
30583 1726853679.48805: getting the next task for host managed_node2
30583 1726853679.48815: done getting next task for host managed_node2
30583 1726853679.48818: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }}
30583 1726853679.48822: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853679.48826: getting variables 30583 1726853679.48827: in VariableManager get_vars() 30583 1726853679.48859: Calling all_inventory to load vars for managed_node2 30583 1726853679.48862: Calling groups_inventory to load vars for managed_node2 30583 1726853679.48865: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853679.48881: Calling all_plugins_play to load vars for managed_node2 30583 1726853679.48883: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853679.48888: done sending task result for task 02083763-bbaf-05ea-abc5-000000000402 30583 1726853679.48891: WORKER PROCESS EXITING 30583 1726853679.48894: Calling groups_plugins_play to load vars for managed_node2 30583 1726853679.50345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853679.51201: done with get_vars() 30583 1726853679.51216: done getting variables 30583 1726853679.51260: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853679.51351: variable 'profile' from source: play vars 30583 1726853679.51354: variable 'interface' from source: play vars 30583 1726853679.51401: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 13:34:39 -0400 (0:00:00.043) 0:00:14.851 ****** 30583 1726853679.51424: entering _queue_task() for managed_node2/command 30583 1726853679.51677: worker is 1 (out of 1 available) 30583 1726853679.51689: exiting _queue_task() for managed_node2/command 30583 1726853679.51701: done queuing things up, now waiting for results queue to drain 30583 1726853679.51702: waiting for pending results... 
30583 1726853679.51882: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr
30583 1726853679.51961: in run() - task 02083763-bbaf-05ea-abc5-000000000404
30583 1726853679.51970: variable 'ansible_search_path' from source: unknown
30583 1726853679.51975: variable 'ansible_search_path' from source: unknown
30583 1726853679.52008: calling self._execute()
30583 1726853679.52084: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853679.52088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853679.52097: variable 'omit' from source: magic vars
30583 1726853679.52467: variable 'ansible_distribution_major_version' from source: facts
30583 1726853679.52497: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853679.52677: variable 'profile_stat' from source: set_fact
30583 1726853679.52984: Evaluated conditional (profile_stat.stat.exists): False
30583 1726853679.52988: when evaluation is False, skipping this task
30583 1726853679.52991: _execute() done
30583 1726853679.52993: dumping result to json
30583 1726853679.52996: done dumping result, returning
30583 1726853679.53004: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr [02083763-bbaf-05ea-abc5-000000000404]
30583 1726853679.53006: sending task result for task 02083763-bbaf-05ea-abc5-000000000404
30583 1726853679.53096: done sending task result for task 02083763-bbaf-05ea-abc5-000000000404
30583 1726853679.53099: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
30583 1726853679.53206: no more pending results, returning what we have
30583 1726853679.53209: results queue empty
30583 1726853679.53210: checking for any_errors_fatal
30583 1726853679.53214: done checking for any_errors_fatal
30583 1726853679.53215:
checking for max_fail_percentage 30583 1726853679.53217: done checking for max_fail_percentage 30583 1726853679.53217: checking to see if all hosts have failed and the running result is not ok 30583 1726853679.53218: done checking to see if all hosts have failed 30583 1726853679.53219: getting the remaining hosts for this loop 30583 1726853679.53220: done getting the remaining hosts for this loop 30583 1726853679.53223: getting the next task for host managed_node2 30583 1726853679.53229: done getting next task for host managed_node2 30583 1726853679.53231: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 30583 1726853679.53235: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853679.53239: getting variables 30583 1726853679.53240: in VariableManager get_vars() 30583 1726853679.53267: Calling all_inventory to load vars for managed_node2 30583 1726853679.53269: Calling groups_inventory to load vars for managed_node2 30583 1726853679.53275: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853679.53284: Calling all_plugins_play to load vars for managed_node2 30583 1726853679.53286: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853679.53288: Calling groups_plugins_play to load vars for managed_node2 30583 1726853679.54566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853679.56201: done with get_vars() 30583 1726853679.56228: done getting variables 30583 1726853679.56306: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853679.56423: variable 'profile' from source: play vars 30583 1726853679.56427: variable 'interface' from source: play vars 30583 1726853679.56490: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:34:39 -0400 (0:00:00.050) 0:00:14.902 ****** 30583 1726853679.56518: entering _queue_task() for managed_node2/set_fact 30583 1726853679.57074: worker is 1 (out of 1 available) 30583 1726853679.57086: exiting _queue_task() for managed_node2/set_fact 30583 1726853679.57099: done queuing things up, now waiting for results queue to drain 30583 1726853679.57101: waiting for pending results... 
30583 1726853679.57289: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr
30583 1726853679.57438: in run() - task 02083763-bbaf-05ea-abc5-000000000405
30583 1726853679.57442: variable 'ansible_search_path' from source: unknown
30583 1726853679.57445: variable 'ansible_search_path' from source: unknown
30583 1726853679.57453: calling self._execute()
30583 1726853679.57573: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853679.57587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853679.57601: variable 'omit' from source: magic vars
30583 1726853679.58003: variable 'ansible_distribution_major_version' from source: facts
30583 1726853679.58022: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853679.58151: variable 'profile_stat' from source: set_fact
30583 1726853679.58173: Evaluated conditional (profile_stat.stat.exists): False
30583 1726853679.58193: when evaluation is False, skipping this task
30583 1726853679.58197: _execute() done
30583 1726853679.58201: dumping result to json
30583 1726853679.58276: done dumping result, returning
30583 1726853679.58280: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [02083763-bbaf-05ea-abc5-000000000405]
30583 1726853679.58283: sending task result for task 02083763-bbaf-05ea-abc5-000000000405
30583 1726853679.58577: done sending task result for task 02083763-bbaf-05ea-abc5-000000000405
30583 1726853679.58580: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
30583 1726853679.58620: no more pending results, returning what we have
30583 1726853679.58624: results queue empty
30583 1726853679.58625: checking for any_errors_fatal
30583 1726853679.58630: done checking for any_errors_fatal
30583 1726853679.58632:
checking for max_fail_percentage 30583 1726853679.58634: done checking for max_fail_percentage 30583 1726853679.58635: checking to see if all hosts have failed and the running result is not ok 30583 1726853679.58636: done checking to see if all hosts have failed 30583 1726853679.58636: getting the remaining hosts for this loop 30583 1726853679.58638: done getting the remaining hosts for this loop 30583 1726853679.58641: getting the next task for host managed_node2 30583 1726853679.58648: done getting next task for host managed_node2 30583 1726853679.58651: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 30583 1726853679.58658: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853679.58662: getting variables 30583 1726853679.58664: in VariableManager get_vars() 30583 1726853679.58694: Calling all_inventory to load vars for managed_node2 30583 1726853679.58697: Calling groups_inventory to load vars for managed_node2 30583 1726853679.58700: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853679.58710: Calling all_plugins_play to load vars for managed_node2 30583 1726853679.58713: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853679.58716: Calling groups_plugins_play to load vars for managed_node2 30583 1726853679.60267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853679.61787: done with get_vars() 30583 1726853679.61811: done getting variables 30583 1726853679.61876: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853679.61988: variable 'profile' from source: play vars 30583 1726853679.61992: variable 'interface' from source: play vars 30583 1726853679.62049: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:34:39 -0400 (0:00:00.055) 0:00:14.958 ****** 30583 1726853679.62085: entering _queue_task() for managed_node2/command 30583 1726853679.62437: worker is 1 (out of 1 available) 30583 1726853679.62449: exiting _queue_task() for managed_node2/command 30583 1726853679.62464: done queuing things up, now waiting for results queue to drain 30583 1726853679.62466: waiting for pending results... 
30583 1726853679.62763: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr 30583 1726853679.62891: in run() - task 02083763-bbaf-05ea-abc5-000000000406 30583 1726853679.62914: variable 'ansible_search_path' from source: unknown 30583 1726853679.62922: variable 'ansible_search_path' from source: unknown 30583 1726853679.62969: calling self._execute() 30583 1726853679.63079: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853679.63089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853679.63103: variable 'omit' from source: magic vars 30583 1726853679.63469: variable 'ansible_distribution_major_version' from source: facts 30583 1726853679.63487: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853679.63606: variable 'profile_stat' from source: set_fact 30583 1726853679.63622: Evaluated conditional (profile_stat.stat.exists): False 30583 1726853679.63629: when evaluation is False, skipping this task 30583 1726853679.63636: _execute() done 30583 1726853679.63642: dumping result to json 30583 1726853679.63650: done dumping result, returning 30583 1726853679.63667: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr [02083763-bbaf-05ea-abc5-000000000406] 30583 1726853679.63677: sending task result for task 02083763-bbaf-05ea-abc5-000000000406 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30583 1726853679.63827: no more pending results, returning what we have 30583 1726853679.63832: results queue empty 30583 1726853679.63833: checking for any_errors_fatal 30583 1726853679.63841: done checking for any_errors_fatal 30583 1726853679.63842: checking for max_fail_percentage 30583 1726853679.63844: done checking for max_fail_percentage 30583 1726853679.63845: checking to see if all hosts have 
failed and the running result is not ok 30583 1726853679.63846: done checking to see if all hosts have failed 30583 1726853679.63847: getting the remaining hosts for this loop 30583 1726853679.63848: done getting the remaining hosts for this loop 30583 1726853679.63853: getting the next task for host managed_node2 30583 1726853679.63864: done getting next task for host managed_node2 30583 1726853679.63867: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 30583 1726853679.63874: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853679.63878: getting variables 30583 1726853679.63880: in VariableManager get_vars() 30583 1726853679.63913: Calling all_inventory to load vars for managed_node2 30583 1726853679.63916: Calling groups_inventory to load vars for managed_node2 30583 1726853679.63920: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853679.63933: Calling all_plugins_play to load vars for managed_node2 30583 1726853679.63936: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853679.63939: Calling groups_plugins_play to load vars for managed_node2 30583 1726853679.64784: done sending task result for task 02083763-bbaf-05ea-abc5-000000000406 30583 1726853679.64787: WORKER PROCESS EXITING 30583 1726853679.65565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853679.67139: done with get_vars() 30583 1726853679.67172: done getting variables 30583 1726853679.67238: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853679.67365: variable 'profile' from source: play vars 30583 1726853679.67370: variable 'interface' from source: play vars 30583 1726853679.67434: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:34:39 -0400 (0:00:00.053) 0:00:15.011 ****** 30583 1726853679.67473: entering _queue_task() for managed_node2/set_fact 30583 1726853679.67839: worker is 1 (out of 1 available) 30583 1726853679.67853: exiting _queue_task() for managed_node2/set_fact 30583 
1726853679.67867: done queuing things up, now waiting for results queue to drain 30583 1726853679.67869: waiting for pending results... 30583 1726853679.68164: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr 30583 1726853679.68324: in run() - task 02083763-bbaf-05ea-abc5-000000000407 30583 1726853679.68345: variable 'ansible_search_path' from source: unknown 30583 1726853679.68354: variable 'ansible_search_path' from source: unknown 30583 1726853679.68399: calling self._execute() 30583 1726853679.68495: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853679.68507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853679.68528: variable 'omit' from source: magic vars 30583 1726853679.68890: variable 'ansible_distribution_major_version' from source: facts 30583 1726853679.68906: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853679.69034: variable 'profile_stat' from source: set_fact 30583 1726853679.69051: Evaluated conditional (profile_stat.stat.exists): False 30583 1726853679.69277: when evaluation is False, skipping this task 30583 1726853679.69281: _execute() done 30583 1726853679.69284: dumping result to json 30583 1726853679.69286: done dumping result, returning 30583 1726853679.69289: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr [02083763-bbaf-05ea-abc5-000000000407] 30583 1726853679.69291: sending task result for task 02083763-bbaf-05ea-abc5-000000000407 30583 1726853679.69364: done sending task result for task 02083763-bbaf-05ea-abc5-000000000407 30583 1726853679.69367: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30583 1726853679.69417: no more pending results, returning what we have 30583 1726853679.69421: results queue empty 30583 
1726853679.69422: checking for any_errors_fatal 30583 1726853679.69428: done checking for any_errors_fatal 30583 1726853679.69429: checking for max_fail_percentage 30583 1726853679.69432: done checking for max_fail_percentage 30583 1726853679.69433: checking to see if all hosts have failed and the running result is not ok 30583 1726853679.69434: done checking to see if all hosts have failed 30583 1726853679.69435: getting the remaining hosts for this loop 30583 1726853679.69437: done getting the remaining hosts for this loop 30583 1726853679.69440: getting the next task for host managed_node2 30583 1726853679.69451: done getting next task for host managed_node2 30583 1726853679.69454: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 30583 1726853679.69460: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853679.69466: getting variables 30583 1726853679.69468: in VariableManager get_vars() 30583 1726853679.69503: Calling all_inventory to load vars for managed_node2 30583 1726853679.69506: Calling groups_inventory to load vars for managed_node2 30583 1726853679.69511: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853679.69525: Calling all_plugins_play to load vars for managed_node2 30583 1726853679.69528: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853679.69531: Calling groups_plugins_play to load vars for managed_node2 30583 1726853679.71143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853679.72724: done with get_vars() 30583 1726853679.72763: done getting variables 30583 1726853679.72833: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853679.72963: variable 'profile' from source: play vars 30583 1726853679.72967: variable 'interface' from source: play vars 30583 1726853679.73028: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'statebr'] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 13:34:39 -0400 (0:00:00.055) 0:00:15.067 ****** 30583 1726853679.73066: entering _queue_task() for managed_node2/assert 30583 1726853679.73436: worker is 1 (out of 1 available) 30583 1726853679.73450: exiting _queue_task() for managed_node2/assert 30583 1726853679.73464: done queuing things up, now waiting for results queue to drain 30583 1726853679.73465: waiting for pending results... 
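[annotation] The next three tasks come from assert_profile_present.yml (lines 5, 10, 15) and each test a single boolean fact set earlier via `set_fact`; the trace shows all three conditionals (`lsr_net_profile_exists`, `lsr_net_profile_ansible_managed`, `lsr_net_profile_fingerprint`) evaluating True, yielding "All assertions passed". A hedged sketch of what these assert tasks typically look like — task names and tested facts come from the trace, the exact `assert` argument layout is an assumption:

```yaml
# Hedged sketch of the assert tasks at assert_profile_present.yml:5/10/15.
# Task names and the facts tested are confirmed by the trace; the precise
# YAML layout is assumed.
- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists            # trace: evaluated True

- name: Assert that the ansible managed comment is present in '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_ansible_managed   # trace: evaluated True

- name: Assert that the fingerprint comment is present in {{ profile }}
  assert:
    that:
      - lsr_net_profile_fingerprint       # trace: evaluated True
```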
30583 1726853679.73781: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'statebr' 30583 1726853679.73895: in run() - task 02083763-bbaf-05ea-abc5-000000000384 30583 1726853679.73915: variable 'ansible_search_path' from source: unknown 30583 1726853679.73921: variable 'ansible_search_path' from source: unknown 30583 1726853679.74076: calling self._execute() 30583 1726853679.74081: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853679.74084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853679.74092: variable 'omit' from source: magic vars 30583 1726853679.74458: variable 'ansible_distribution_major_version' from source: facts 30583 1726853679.74480: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853679.74491: variable 'omit' from source: magic vars 30583 1726853679.74543: variable 'omit' from source: magic vars 30583 1726853679.74650: variable 'profile' from source: play vars 30583 1726853679.74662: variable 'interface' from source: play vars 30583 1726853679.74729: variable 'interface' from source: play vars 30583 1726853679.74759: variable 'omit' from source: magic vars 30583 1726853679.74808: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853679.74853: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853679.74884: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853679.74961: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853679.74964: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853679.74967: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 30583 1726853679.74968: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853679.74972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853679.75074: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853679.75087: Set connection var ansible_timeout to 10 30583 1726853679.75093: Set connection var ansible_connection to ssh 30583 1726853679.75104: Set connection var ansible_shell_executable to /bin/sh 30583 1726853679.75111: Set connection var ansible_shell_type to sh 30583 1726853679.75124: Set connection var ansible_pipelining to False 30583 1726853679.75154: variable 'ansible_shell_executable' from source: unknown 30583 1726853679.75165: variable 'ansible_connection' from source: unknown 30583 1726853679.75275: variable 'ansible_module_compression' from source: unknown 30583 1726853679.75280: variable 'ansible_shell_type' from source: unknown 30583 1726853679.75283: variable 'ansible_shell_executable' from source: unknown 30583 1726853679.75285: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853679.75287: variable 'ansible_pipelining' from source: unknown 30583 1726853679.75289: variable 'ansible_timeout' from source: unknown 30583 1726853679.75291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853679.75348: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853679.75366: variable 'omit' from source: magic vars 30583 1726853679.75378: starting attempt loop 30583 1726853679.75384: running the handler 30583 1726853679.75500: variable 'lsr_net_profile_exists' from source: set_fact 30583 1726853679.75514: Evaluated conditional 
(lsr_net_profile_exists): True 30583 1726853679.75523: handler run complete 30583 1726853679.75541: attempt loop complete, returning result 30583 1726853679.75546: _execute() done 30583 1726853679.75552: dumping result to json 30583 1726853679.75622: done dumping result, returning 30583 1726853679.75625: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'statebr' [02083763-bbaf-05ea-abc5-000000000384] 30583 1726853679.75628: sending task result for task 02083763-bbaf-05ea-abc5-000000000384 30583 1726853679.75695: done sending task result for task 02083763-bbaf-05ea-abc5-000000000384 30583 1726853679.75698: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30583 1726853679.75777: no more pending results, returning what we have 30583 1726853679.75781: results queue empty 30583 1726853679.75782: checking for any_errors_fatal 30583 1726853679.75790: done checking for any_errors_fatal 30583 1726853679.75791: checking for max_fail_percentage 30583 1726853679.75793: done checking for max_fail_percentage 30583 1726853679.75794: checking to see if all hosts have failed and the running result is not ok 30583 1726853679.75795: done checking to see if all hosts have failed 30583 1726853679.75796: getting the remaining hosts for this loop 30583 1726853679.75798: done getting the remaining hosts for this loop 30583 1726853679.75801: getting the next task for host managed_node2 30583 1726853679.75810: done getting next task for host managed_node2 30583 1726853679.75812: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 30583 1726853679.75816: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853679.75821: getting variables 30583 1726853679.75823: in VariableManager get_vars() 30583 1726853679.75859: Calling all_inventory to load vars for managed_node2 30583 1726853679.75862: Calling groups_inventory to load vars for managed_node2 30583 1726853679.75866: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853679.75979: Calling all_plugins_play to load vars for managed_node2 30583 1726853679.75983: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853679.75986: Calling groups_plugins_play to load vars for managed_node2 30583 1726853679.77452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853679.79184: done with get_vars() 30583 1726853679.79208: done getting variables 30583 1726853679.79275: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853679.79394: variable 'profile' from source: play vars 30583 1726853679.79398: variable 'interface' from source: play vars 30583 1726853679.79459: variable 'interface' from 
source: play vars TASK [Assert that the ansible managed comment is present in 'statebr'] ********* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 13:34:39 -0400 (0:00:00.064) 0:00:15.132 ****** 30583 1726853679.79499: entering _queue_task() for managed_node2/assert 30583 1726853679.79847: worker is 1 (out of 1 available) 30583 1726853679.79860: exiting _queue_task() for managed_node2/assert 30583 1726853679.79976: done queuing things up, now waiting for results queue to drain 30583 1726853679.79978: waiting for pending results... 30583 1726853679.80289: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'statebr' 30583 1726853679.80294: in run() - task 02083763-bbaf-05ea-abc5-000000000385 30583 1726853679.80297: variable 'ansible_search_path' from source: unknown 30583 1726853679.80299: variable 'ansible_search_path' from source: unknown 30583 1726853679.80323: calling self._execute() 30583 1726853679.80419: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853679.80428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853679.80440: variable 'omit' from source: magic vars 30583 1726853679.80810: variable 'ansible_distribution_major_version' from source: facts 30583 1726853679.80828: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853679.80840: variable 'omit' from source: magic vars 30583 1726853679.80898: variable 'omit' from source: magic vars 30583 1726853679.81005: variable 'profile' from source: play vars 30583 1726853679.81066: variable 'interface' from source: play vars 30583 1726853679.81089: variable 'interface' from source: play vars 30583 1726853679.81116: variable 'omit' from source: magic vars 30583 1726853679.81163: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 
30583 1726853679.81208: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853679.81234: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853679.81259: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853679.81280: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853679.81376: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853679.81379: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853679.81381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853679.81437: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853679.81449: Set connection var ansible_timeout to 10 30583 1726853679.81459: Set connection var ansible_connection to ssh 30583 1726853679.81472: Set connection var ansible_shell_executable to /bin/sh 30583 1726853679.81480: Set connection var ansible_shell_type to sh 30583 1726853679.81500: Set connection var ansible_pipelining to False 30583 1726853679.81528: variable 'ansible_shell_executable' from source: unknown 30583 1726853679.81536: variable 'ansible_connection' from source: unknown 30583 1726853679.81604: variable 'ansible_module_compression' from source: unknown 30583 1726853679.81608: variable 'ansible_shell_type' from source: unknown 30583 1726853679.81610: variable 'ansible_shell_executable' from source: unknown 30583 1726853679.81612: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853679.81614: variable 'ansible_pipelining' from source: unknown 30583 1726853679.81616: variable 'ansible_timeout' from source: unknown 30583 1726853679.81619: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853679.81728: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853679.81745: variable 'omit' from source: magic vars 30583 1726853679.81760: starting attempt loop 30583 1726853679.81768: running the handler 30583 1726853679.81881: variable 'lsr_net_profile_ansible_managed' from source: set_fact 30583 1726853679.81891: Evaluated conditional (lsr_net_profile_ansible_managed): True 30583 1726853679.81901: handler run complete 30583 1726853679.81931: attempt loop complete, returning result 30583 1726853679.81934: _execute() done 30583 1726853679.81936: dumping result to json 30583 1726853679.82039: done dumping result, returning 30583 1726853679.82042: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'statebr' [02083763-bbaf-05ea-abc5-000000000385] 30583 1726853679.82045: sending task result for task 02083763-bbaf-05ea-abc5-000000000385 30583 1726853679.82114: done sending task result for task 02083763-bbaf-05ea-abc5-000000000385 30583 1726853679.82117: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30583 1726853679.82198: no more pending results, returning what we have 30583 1726853679.82201: results queue empty 30583 1726853679.82202: checking for any_errors_fatal 30583 1726853679.82213: done checking for any_errors_fatal 30583 1726853679.82214: checking for max_fail_percentage 30583 1726853679.82216: done checking for max_fail_percentage 30583 1726853679.82217: checking to see if all hosts have failed and the running result is not ok 30583 1726853679.82218: done checking to see if all hosts have failed 30583 1726853679.82219: 
getting the remaining hosts for this loop 30583 1726853679.82221: done getting the remaining hosts for this loop 30583 1726853679.82225: getting the next task for host managed_node2 30583 1726853679.82234: done getting next task for host managed_node2 30583 1726853679.82237: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 30583 1726853679.82241: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853679.82246: getting variables 30583 1726853679.82248: in VariableManager get_vars() 30583 1726853679.82285: Calling all_inventory to load vars for managed_node2 30583 1726853679.82288: Calling groups_inventory to load vars for managed_node2 30583 1726853679.82293: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853679.82304: Calling all_plugins_play to load vars for managed_node2 30583 1726853679.82307: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853679.82310: Calling groups_plugins_play to load vars for managed_node2 30583 1726853679.83865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853679.85462: done with get_vars() 30583 1726853679.85493: done getting variables 30583 1726853679.85554: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853679.85676: variable 'profile' from source: play vars 30583 1726853679.85680: variable 'interface' from source: play vars 30583 1726853679.85745: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in statebr] *************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 13:34:39 -0400 (0:00:00.062) 0:00:15.195 ****** 30583 1726853679.85782: entering _queue_task() for managed_node2/assert 30583 1726853679.86124: worker is 1 (out of 1 available) 30583 1726853679.86136: exiting _queue_task() for managed_node2/assert 30583 1726853679.86147: done queuing things up, now waiting for results queue to drain 30583 1726853679.86149: waiting for pending results... 
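[annotation] Before each handler runs, the executor resolves per-host connection variables; the trace repeatedly settles on `ansible_connection=ssh`, `ansible_timeout=10`, `ansible_shell_executable=/bin/sh`, `ansible_shell_type=sh`, `ansible_pipelining=False`. A hedged sketch of how such values could be pinned per host in a YAML inventory — this is illustrative only, not the contents of the actual /tmp/network-iHm/inventory.yml, which the log does not show:

```yaml
# Illustrative YAML inventory fragment (NOT the real inventory file).
# The variable names are standard Ansible behavioral connection variables;
# the host address is a placeholder.
all:
  hosts:
    managed_node2:
      ansible_host: 203.0.113.10          # placeholder address
      ansible_connection: ssh
      ansible_timeout: 10
      ansible_shell_executable: /bin/sh
      ansible_pipelining: false
```

Values not set in inventory fall back to defaults, which is why most of these appear in the trace as "variable ... from source: unknown".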
30583 1726853679.86447: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in statebr 30583 1726853679.86596: in run() - task 02083763-bbaf-05ea-abc5-000000000386 30583 1726853679.86600: variable 'ansible_search_path' from source: unknown 30583 1726853679.86602: variable 'ansible_search_path' from source: unknown 30583 1726853679.86639: calling self._execute() 30583 1726853679.86776: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853679.86781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853679.86783: variable 'omit' from source: magic vars 30583 1726853679.87136: variable 'ansible_distribution_major_version' from source: facts 30583 1726853679.87158: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853679.87376: variable 'omit' from source: magic vars 30583 1726853679.87380: variable 'omit' from source: magic vars 30583 1726853679.87382: variable 'profile' from source: play vars 30583 1726853679.87385: variable 'interface' from source: play vars 30583 1726853679.87401: variable 'interface' from source: play vars 30583 1726853679.87424: variable 'omit' from source: magic vars 30583 1726853679.87475: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853679.87522: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853679.87545: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853679.87569: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853679.87587: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853679.87623: variable 'inventory_hostname' from source: host 
vars for 'managed_node2' 30583 1726853679.87632: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853679.87639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853679.87748: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853679.87761: Set connection var ansible_timeout to 10 30583 1726853679.87768: Set connection var ansible_connection to ssh 30583 1726853679.87778: Set connection var ansible_shell_executable to /bin/sh 30583 1726853679.87784: Set connection var ansible_shell_type to sh 30583 1726853679.87797: Set connection var ansible_pipelining to False 30583 1726853679.87825: variable 'ansible_shell_executable' from source: unknown 30583 1726853679.87834: variable 'ansible_connection' from source: unknown 30583 1726853679.87840: variable 'ansible_module_compression' from source: unknown 30583 1726853679.87845: variable 'ansible_shell_type' from source: unknown 30583 1726853679.87851: variable 'ansible_shell_executable' from source: unknown 30583 1726853679.87859: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853679.87866: variable 'ansible_pipelining' from source: unknown 30583 1726853679.87874: variable 'ansible_timeout' from source: unknown 30583 1726853679.87881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853679.88018: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853679.88035: variable 'omit' from source: magic vars 30583 1726853679.88048: starting attempt loop 30583 1726853679.88054: running the handler 30583 1726853679.88166: variable 'lsr_net_profile_fingerprint' from source: set_fact 30583 1726853679.88177: Evaluated 
conditional (lsr_net_profile_fingerprint): True 30583 1726853679.88260: handler run complete 30583 1726853679.88263: attempt loop complete, returning result 30583 1726853679.88266: _execute() done 30583 1726853679.88268: dumping result to json 30583 1726853679.88270: done dumping result, returning 30583 1726853679.88273: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in statebr [02083763-bbaf-05ea-abc5-000000000386] 30583 1726853679.88275: sending task result for task 02083763-bbaf-05ea-abc5-000000000386 30583 1726853679.88336: done sending task result for task 02083763-bbaf-05ea-abc5-000000000386 30583 1726853679.88339: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30583 1726853679.88413: no more pending results, returning what we have 30583 1726853679.88417: results queue empty 30583 1726853679.88418: checking for any_errors_fatal 30583 1726853679.88426: done checking for any_errors_fatal 30583 1726853679.88427: checking for max_fail_percentage 30583 1726853679.88430: done checking for max_fail_percentage 30583 1726853679.88431: checking to see if all hosts have failed and the running result is not ok 30583 1726853679.88432: done checking to see if all hosts have failed 30583 1726853679.88433: getting the remaining hosts for this loop 30583 1726853679.88435: done getting the remaining hosts for this loop 30583 1726853679.88439: getting the next task for host managed_node2 30583 1726853679.88448: done getting next task for host managed_node2 30583 1726853679.88452: ^ task is: TASK: Conditional asserts 30583 1726853679.88458: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853679.88464: getting variables 30583 1726853679.88466: in VariableManager get_vars() 30583 1726853679.88499: Calling all_inventory to load vars for managed_node2 30583 1726853679.88502: Calling groups_inventory to load vars for managed_node2 30583 1726853679.88506: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853679.88517: Calling all_plugins_play to load vars for managed_node2 30583 1726853679.88521: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853679.88524: Calling groups_plugins_play to load vars for managed_node2 30583 1726853679.90301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853679.91282: done with get_vars() 30583 1726853679.91299: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 13:34:39 -0400 (0:00:00.055) 0:00:15.250 ****** 30583 1726853679.91372: entering _queue_task() for managed_node2/include_tasks 30583 1726853679.91617: worker is 1 (out of 1 available) 30583 1726853679.91631: exiting _queue_task() for managed_node2/include_tasks 30583 1726853679.91644: done queuing things up, now waiting for results queue to drain 30583 1726853679.91646: waiting for pending results... 
30583 1726853679.91830: running TaskExecutor() for managed_node2/TASK: Conditional asserts 30583 1726853679.91907: in run() - task 02083763-bbaf-05ea-abc5-000000000097 30583 1726853679.91918: variable 'ansible_search_path' from source: unknown 30583 1726853679.91921: variable 'ansible_search_path' from source: unknown 30583 1726853679.92131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853679.94027: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853679.94076: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853679.94107: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853679.94134: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853679.94154: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853679.94226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853679.94248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853679.94269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853679.94296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30583 1726853679.94308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853679.94388: variable 'lsr_assert_when' from source: include params 30583 1726853679.94469: variable 'network_provider' from source: set_fact 30583 1726853679.94522: variable 'omit' from source: magic vars 30583 1726853679.94604: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853679.94611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853679.94619: variable 'omit' from source: magic vars 30583 1726853679.94752: variable 'ansible_distribution_major_version' from source: facts 30583 1726853679.94763: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853679.94839: variable 'item' from source: unknown 30583 1726853679.94845: Evaluated conditional (item['condition']): True 30583 1726853679.94904: variable 'item' from source: unknown 30583 1726853679.94926: variable 'item' from source: unknown 30583 1726853679.94975: variable 'item' from source: unknown 30583 1726853679.95108: dumping result to json 30583 1726853679.95111: done dumping result, returning 30583 1726853679.95113: done running TaskExecutor() for managed_node2/TASK: Conditional asserts [02083763-bbaf-05ea-abc5-000000000097] 30583 1726853679.95115: sending task result for task 02083763-bbaf-05ea-abc5-000000000097 30583 1726853679.95149: done sending task result for task 02083763-bbaf-05ea-abc5-000000000097 30583 1726853679.95151: WORKER PROCESS EXITING 30583 1726853679.95174: no more pending results, returning what we have 30583 1726853679.95179: in VariableManager get_vars() 30583 1726853679.95212: Calling all_inventory to load vars for managed_node2 30583 1726853679.95214: Calling groups_inventory to load vars for managed_node2 30583 1726853679.95218: 
Calling all_plugins_inventory to load vars for managed_node2 30583 1726853679.95227: Calling all_plugins_play to load vars for managed_node2 30583 1726853679.95230: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853679.95232: Calling groups_plugins_play to load vars for managed_node2 30583 1726853679.96509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853679.97563: done with get_vars() 30583 1726853679.97583: variable 'ansible_search_path' from source: unknown 30583 1726853679.97585: variable 'ansible_search_path' from source: unknown 30583 1726853679.97613: we have included files to process 30583 1726853679.97614: generating all_blocks data 30583 1726853679.97616: done generating all_blocks data 30583 1726853679.97620: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30583 1726853679.97621: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30583 1726853679.97623: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30583 1726853679.97736: in VariableManager get_vars() 30583 1726853679.97750: done with get_vars() 30583 1726853679.97827: done processing included file 30583 1726853679.97828: iterating over new_blocks loaded from include file 30583 1726853679.97829: in VariableManager get_vars() 30583 1726853679.97840: done with get_vars() 30583 1726853679.97842: filtering new block on tags 30583 1726853679.97866: done filtering new block on tags 30583 1726853679.97868: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node2 => (item={'what': 
'tasks/assert_device_present.yml', 'condition': True}) 30583 1726853679.97873: extending task lists for all hosts with included blocks 30583 1726853679.98551: done extending task lists 30583 1726853679.98552: done processing included files 30583 1726853679.98553: results queue empty 30583 1726853679.98553: checking for any_errors_fatal 30583 1726853679.98557: done checking for any_errors_fatal 30583 1726853679.98558: checking for max_fail_percentage 30583 1726853679.98559: done checking for max_fail_percentage 30583 1726853679.98559: checking to see if all hosts have failed and the running result is not ok 30583 1726853679.98560: done checking to see if all hosts have failed 30583 1726853679.98560: getting the remaining hosts for this loop 30583 1726853679.98561: done getting the remaining hosts for this loop 30583 1726853679.98563: getting the next task for host managed_node2 30583 1726853679.98566: done getting next task for host managed_node2 30583 1726853679.98567: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30583 1726853679.98569: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853679.98578: getting variables 30583 1726853679.98578: in VariableManager get_vars() 30583 1726853679.98591: Calling all_inventory to load vars for managed_node2 30583 1726853679.98593: Calling groups_inventory to load vars for managed_node2 30583 1726853679.98595: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853679.98601: Calling all_plugins_play to load vars for managed_node2 30583 1726853679.98604: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853679.98607: Calling groups_plugins_play to load vars for managed_node2 30583 1726853679.99709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853680.01597: done with get_vars() 30583 1726853680.01625: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:34:40 -0400 (0:00:00.103) 0:00:15.354 ****** 30583 1726853680.01714: entering _queue_task() for managed_node2/include_tasks 30583 1726853680.02074: worker is 1 (out of 1 available) 30583 1726853680.02086: exiting _queue_task() for managed_node2/include_tasks 30583 1726853680.02100: done queuing things up, now waiting for results queue to drain 30583 1726853680.02102: waiting for pending results... 
30583 1726853680.02557: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 30583 1726853680.02660: in run() - task 02083763-bbaf-05ea-abc5-000000000452 30583 1726853680.02667: variable 'ansible_search_path' from source: unknown 30583 1726853680.02676: variable 'ansible_search_path' from source: unknown 30583 1726853680.02684: calling self._execute() 30583 1726853680.02687: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853680.02725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853680.02730: variable 'omit' from source: magic vars 30583 1726853680.03279: variable 'ansible_distribution_major_version' from source: facts 30583 1726853680.03477: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853680.03481: _execute() done 30583 1726853680.03483: dumping result to json 30583 1726853680.03485: done dumping result, returning 30583 1726853680.03487: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-05ea-abc5-000000000452] 30583 1726853680.03488: sending task result for task 02083763-bbaf-05ea-abc5-000000000452 30583 1726853680.03554: done sending task result for task 02083763-bbaf-05ea-abc5-000000000452 30583 1726853680.03560: WORKER PROCESS EXITING 30583 1726853680.03590: no more pending results, returning what we have 30583 1726853680.03595: in VariableManager get_vars() 30583 1726853680.03635: Calling all_inventory to load vars for managed_node2 30583 1726853680.03638: Calling groups_inventory to load vars for managed_node2 30583 1726853680.03641: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853680.03651: Calling all_plugins_play to load vars for managed_node2 30583 1726853680.03654: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853680.03657: Calling groups_plugins_play to load vars for managed_node2 30583 
1726853680.05764: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853680.07400: done with get_vars() 30583 1726853680.07419: variable 'ansible_search_path' from source: unknown 30583 1726853680.07421: variable 'ansible_search_path' from source: unknown 30583 1726853680.07563: variable 'item' from source: include params 30583 1726853680.07601: we have included files to process 30583 1726853680.07603: generating all_blocks data 30583 1726853680.07605: done generating all_blocks data 30583 1726853680.07606: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30583 1726853680.07607: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30583 1726853680.07610: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30583 1726853680.07793: done processing included file 30583 1726853680.07795: iterating over new_blocks loaded from include file 30583 1726853680.07797: in VariableManager get_vars() 30583 1726853680.07813: done with get_vars() 30583 1726853680.07815: filtering new block on tags 30583 1726853680.07842: done filtering new block on tags 30583 1726853680.07844: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 30583 1726853680.07849: extending task lists for all hosts with included blocks 30583 1726853680.08010: done extending task lists 30583 1726853680.08011: done processing included files 30583 1726853680.08012: results queue empty 30583 1726853680.08012: checking for any_errors_fatal 30583 1726853680.08016: done checking for any_errors_fatal 30583 1726853680.08017: checking for 
max_fail_percentage 30583 1726853680.08018: done checking for max_fail_percentage 30583 1726853680.08019: checking to see if all hosts have failed and the running result is not ok 30583 1726853680.08020: done checking to see if all hosts have failed 30583 1726853680.08020: getting the remaining hosts for this loop 30583 1726853680.08022: done getting the remaining hosts for this loop 30583 1726853680.08024: getting the next task for host managed_node2 30583 1726853680.08029: done getting next task for host managed_node2 30583 1726853680.08031: ^ task is: TASK: Get stat for interface {{ interface }} 30583 1726853680.08035: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853680.08037: getting variables 30583 1726853680.08038: in VariableManager get_vars() 30583 1726853680.08046: Calling all_inventory to load vars for managed_node2 30583 1726853680.08048: Calling groups_inventory to load vars for managed_node2 30583 1726853680.08051: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853680.08056: Calling all_plugins_play to load vars for managed_node2 30583 1726853680.08058: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853680.08061: Calling groups_plugins_play to load vars for managed_node2 30583 1726853680.09176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853680.10657: done with get_vars() 30583 1726853680.10680: done getting variables 30583 1726853680.10804: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:34:40 -0400 (0:00:00.091) 0:00:15.445 ****** 30583 1726853680.10835: entering _queue_task() for managed_node2/stat 30583 1726853680.11183: worker is 1 (out of 1 available) 30583 1726853680.11196: exiting _queue_task() for managed_node2/stat 30583 1726853680.11208: done queuing things up, now waiting for results queue to drain 30583 1726853680.11210: waiting for pending results... 
30583 1726853680.11596: running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr 30583 1726853680.11601: in run() - task 02083763-bbaf-05ea-abc5-0000000004e8 30583 1726853680.11617: variable 'ansible_search_path' from source: unknown 30583 1726853680.11621: variable 'ansible_search_path' from source: unknown 30583 1726853680.11658: calling self._execute() 30583 1726853680.11778: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853680.11783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853680.11787: variable 'omit' from source: magic vars 30583 1726853680.12114: variable 'ansible_distribution_major_version' from source: facts 30583 1726853680.12131: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853680.12138: variable 'omit' from source: magic vars 30583 1726853680.12266: variable 'omit' from source: magic vars 30583 1726853680.12283: variable 'interface' from source: play vars 30583 1726853680.12303: variable 'omit' from source: magic vars 30583 1726853680.12348: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853680.12383: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853680.12404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853680.12422: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853680.12434: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853680.12469: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853680.12474: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853680.12481: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853680.12580: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853680.12590: Set connection var ansible_timeout to 10 30583 1726853680.12593: Set connection var ansible_connection to ssh 30583 1726853680.12596: Set connection var ansible_shell_executable to /bin/sh 30583 1726853680.12598: Set connection var ansible_shell_type to sh 30583 1726853680.12701: Set connection var ansible_pipelining to False 30583 1726853680.12704: variable 'ansible_shell_executable' from source: unknown 30583 1726853680.12707: variable 'ansible_connection' from source: unknown 30583 1726853680.12709: variable 'ansible_module_compression' from source: unknown 30583 1726853680.12711: variable 'ansible_shell_type' from source: unknown 30583 1726853680.12713: variable 'ansible_shell_executable' from source: unknown 30583 1726853680.12715: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853680.12716: variable 'ansible_pipelining' from source: unknown 30583 1726853680.12719: variable 'ansible_timeout' from source: unknown 30583 1726853680.12721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853680.12852: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853680.12863: variable 'omit' from source: magic vars 30583 1726853680.12870: starting attempt loop 30583 1726853680.12874: running the handler 30583 1726853680.12892: _low_level_execute_command(): starting 30583 1726853680.12900: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853680.13608: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853680.13621: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 30583 1726853680.13654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853680.13678: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853680.13779: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853680.13788: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853680.13886: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853680.15621: stdout chunk (state=3): >>>/root <<< 30583 1726853680.15743: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853680.15977: stderr chunk (state=3): >>><<< 30583 1726853680.15980: stdout chunk (state=3): >>><<< 30583 1726853680.15984: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 
10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853680.15987: _low_level_execute_command(): starting 30583 1726853680.15990: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853680.1580954-31305-275546572329415 `" && echo ansible-tmp-1726853680.1580954-31305-275546572329415="` echo /root/.ansible/tmp/ansible-tmp-1726853680.1580954-31305-275546572329415 `" ) && sleep 0' 30583 1726853680.16419: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853680.16433: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853680.16439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853680.16458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853680.16468: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853680.16481: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853680.16487: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853680.16503: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853680.16511: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853680.16518: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853680.16526: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853680.16541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853680.16548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853680.16558: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853680.16563: stderr chunk (state=3): >>>debug2: match found <<< 30583 1726853680.16661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853680.16664: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853680.16666: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853680.16669: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853680.16787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853680.19234: stdout chunk (state=3): >>>ansible-tmp-1726853680.1580954-31305-275546572329415=/root/.ansible/tmp/ansible-tmp-1726853680.1580954-31305-275546572329415 <<< 30583 1726853680.19481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853680.19485: stdout chunk (state=3): >>><<< 30583 1726853680.19487: stderr chunk (state=3): >>><<< 30583 1726853680.19489: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853680.1580954-31305-275546572329415=/root/.ansible/tmp/ansible-tmp-1726853680.1580954-31305-275546572329415 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853680.19491: variable 'ansible_module_compression' from source: unknown 30583 1726853680.19515: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30583 1726853680.19550: variable 'ansible_facts' from source: unknown 30583 1726853680.19648: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853680.1580954-31305-275546572329415/AnsiballZ_stat.py 30583 1726853680.19887: Sending initial data 30583 1726853680.19891: Sent initial data (153 bytes) 30583 1726853680.20411: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853680.20420: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853680.20431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853680.20445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853680.20485: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853680.20547: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853680.20562: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853680.20590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853680.20686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853680.22468: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853680.22502: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853680.22562: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpd5hwodkg /root/.ansible/tmp/ansible-tmp-1726853680.1580954-31305-275546572329415/AnsiballZ_stat.py <<< 30583 1726853680.22613: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853680.1580954-31305-275546572329415/AnsiballZ_stat.py" <<< 30583 1726853680.22810: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpd5hwodkg" to remote "/root/.ansible/tmp/ansible-tmp-1726853680.1580954-31305-275546572329415/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853680.1580954-31305-275546572329415/AnsiballZ_stat.py" <<< 30583 1726853680.23842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853680.24076: stderr chunk (state=3): >>><<< 30583 1726853680.24080: stdout chunk (state=3): >>><<< 30583 1726853680.24082: done transferring module to remote 30583 1726853680.24084: _low_level_execute_command(): starting 30583 1726853680.24087: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853680.1580954-31305-275546572329415/ /root/.ansible/tmp/ansible-tmp-1726853680.1580954-31305-275546572329415/AnsiballZ_stat.py && sleep 0' 30583 1726853680.24861: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853680.25000: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853680.25074: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853680.27035: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853680.27046: stdout chunk (state=3): >>><<< 30583 1726853680.27069: stderr chunk (state=3): >>><<< 30583 1726853680.27096: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853680.27104: _low_level_execute_command(): starting 30583 1726853680.27112: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853680.1580954-31305-275546572329415/AnsiballZ_stat.py && sleep 0' 30583 1726853680.27731: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853680.27746: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853680.27763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853680.27784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853680.27802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853680.27815: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853680.27838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853680.27860: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853680.27876: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853680.27947: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853680.27980: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853680.27996: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853680.28018: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853680.28123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853680.44138: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31901, "dev": 23, "nlink": 1, "atime": 1726853677.1936846, "mtime": 1726853677.1936846, "ctime": 1726853677.1936846, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30583 1726853680.45478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853680.45482: stdout chunk (state=3): >>><<< 30583 1726853680.45493: stderr chunk (state=3): >>><<< 30583 1726853680.45540: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31901, "dev": 23, "nlink": 1, "atime": 1726853677.1936846, "mtime": 1726853677.1936846, "ctime": 1726853677.1936846, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853680.45846: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853680.1580954-31305-275546572329415/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853680.45858: _low_level_execute_command(): starting 30583 1726853680.45862: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853680.1580954-31305-275546572329415/ > /dev/null 2>&1 && sleep 0' 30583 1726853680.47052: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853680.47266: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853680.47446: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853680.47464: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853680.49578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853680.49591: stdout chunk (state=3): >>><<< 30583 1726853680.49607: stderr chunk (state=3): >>><<< 30583 1726853680.49628: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853680.49639: handler run complete 30583 1726853680.49812: attempt loop complete, returning result 30583 1726853680.49826: _execute() done 30583 1726853680.49835: dumping result to json 30583 1726853680.49845: done dumping result, returning 30583 1726853680.49890: done running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr [02083763-bbaf-05ea-abc5-0000000004e8] 30583 1726853680.49899: sending task result for task 02083763-bbaf-05ea-abc5-0000000004e8 30583 1726853680.50482: done sending task result for task 02083763-bbaf-05ea-abc5-0000000004e8 30583 1726853680.50486: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726853677.1936846, "block_size": 4096, "blocks": 0, "ctime": 1726853677.1936846, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 31901, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "mode": "0777", "mtime": 1726853677.1936846, "nlink": 1, "path": "/sys/class/net/statebr", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 30583 1726853680.50697: no more pending results, returning what we have 30583 1726853680.50701: results queue empty 30583 1726853680.50702: checking for any_errors_fatal 30583 1726853680.50704: done checking for any_errors_fatal 30583 1726853680.50704: checking for max_fail_percentage 30583 1726853680.50707: done 
checking for max_fail_percentage 30583 1726853680.50708: checking to see if all hosts have failed and the running result is not ok 30583 1726853680.50708: done checking to see if all hosts have failed 30583 1726853680.50709: getting the remaining hosts for this loop 30583 1726853680.50711: done getting the remaining hosts for this loop 30583 1726853680.50715: getting the next task for host managed_node2 30583 1726853680.50725: done getting next task for host managed_node2 30583 1726853680.50728: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 30583 1726853680.50732: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853680.50750: getting variables 30583 1726853680.50752: in VariableManager get_vars() 30583 1726853680.51192: Calling all_inventory to load vars for managed_node2 30583 1726853680.51194: Calling groups_inventory to load vars for managed_node2 30583 1726853680.51198: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853680.51207: Calling all_plugins_play to load vars for managed_node2 30583 1726853680.51210: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853680.51213: Calling groups_plugins_play to load vars for managed_node2 30583 1726853680.54617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853680.58432: done with get_vars() 30583 1726853680.58468: done getting variables 30583 1726853680.58744: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853680.59006: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'statebr'] ************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:34:40 -0400 (0:00:00.482) 0:00:15.927 ****** 30583 1726853680.59039: entering _queue_task() for managed_node2/assert 30583 1726853680.60084: worker is 1 (out of 1 available) 30583 1726853680.60099: exiting _queue_task() for managed_node2/assert 30583 1726853680.60112: done queuing things up, now waiting for results queue to drain 30583 1726853680.60113: waiting for pending results... 
30583 1726853680.60680: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'statebr' 30583 1726853680.60774: in run() - task 02083763-bbaf-05ea-abc5-000000000453 30583 1726853680.61324: variable 'ansible_search_path' from source: unknown 30583 1726853680.61327: variable 'ansible_search_path' from source: unknown 30583 1726853680.61340: calling self._execute() 30583 1726853680.61433: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853680.61443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853680.61451: variable 'omit' from source: magic vars 30583 1726853680.62779: variable 'ansible_distribution_major_version' from source: facts 30583 1726853680.62782: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853680.62785: variable 'omit' from source: magic vars 30583 1726853680.62788: variable 'omit' from source: magic vars 30583 1726853680.63334: variable 'interface' from source: play vars 30583 1726853680.63352: variable 'omit' from source: magic vars 30583 1726853680.63393: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853680.63427: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853680.63448: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853680.63466: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853680.63676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853680.63690: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853680.63694: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853680.63696: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853680.63797: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853680.63803: Set connection var ansible_timeout to 10 30583 1726853680.63805: Set connection var ansible_connection to ssh 30583 1726853680.63811: Set connection var ansible_shell_executable to /bin/sh 30583 1726853680.63813: Set connection var ansible_shell_type to sh 30583 1726853680.63824: Set connection var ansible_pipelining to False 30583 1726853680.63849: variable 'ansible_shell_executable' from source: unknown 30583 1726853680.63852: variable 'ansible_connection' from source: unknown 30583 1726853680.63857: variable 'ansible_module_compression' from source: unknown 30583 1726853680.63860: variable 'ansible_shell_type' from source: unknown 30583 1726853680.63862: variable 'ansible_shell_executable' from source: unknown 30583 1726853680.63864: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853680.63866: variable 'ansible_pipelining' from source: unknown 30583 1726853680.63868: variable 'ansible_timeout' from source: unknown 30583 1726853680.63872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853680.64443: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853680.64447: variable 'omit' from source: magic vars 30583 1726853680.64449: starting attempt loop 30583 1726853680.64452: running the handler 30583 1726853680.65357: variable 'interface_stat' from source: set_fact 30583 1726853680.65374: Evaluated conditional (interface_stat.stat.exists): True 30583 1726853680.65379: handler run complete 30583 1726853680.65420: attempt loop complete, returning result 30583 
1726853680.65423: _execute() done 30583 1726853680.65425: dumping result to json 30583 1726853680.65427: done dumping result, returning 30583 1726853680.65429: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'statebr' [02083763-bbaf-05ea-abc5-000000000453] 30583 1726853680.65432: sending task result for task 02083763-bbaf-05ea-abc5-000000000453 30583 1726853680.65513: done sending task result for task 02083763-bbaf-05ea-abc5-000000000453 30583 1726853680.65517: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30583 1726853680.65590: no more pending results, returning what we have 30583 1726853680.65594: results queue empty 30583 1726853680.65595: checking for any_errors_fatal 30583 1726853680.65610: done checking for any_errors_fatal 30583 1726853680.65611: checking for max_fail_percentage 30583 1726853680.65613: done checking for max_fail_percentage 30583 1726853680.65614: checking to see if all hosts have failed and the running result is not ok 30583 1726853680.65616: done checking to see if all hosts have failed 30583 1726853680.65617: getting the remaining hosts for this loop 30583 1726853680.65620: done getting the remaining hosts for this loop 30583 1726853680.65627: getting the next task for host managed_node2 30583 1726853680.65650: done getting next task for host managed_node2 30583 1726853680.65654: ^ task is: TASK: Success in test '{{ lsr_description }}' 30583 1726853680.65659: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30583 1726853680.65665: getting variables 30583 1726853680.65668: in VariableManager get_vars() 30583 1726853680.65702: Calling all_inventory to load vars for managed_node2 30583 1726853680.65705: Calling groups_inventory to load vars for managed_node2 30583 1726853680.65708: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853680.65719: Calling all_plugins_play to load vars for managed_node2 30583 1726853680.65721: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853680.65724: Calling groups_plugins_play to load vars for managed_node2 30583 1726853680.68748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853680.71878: done with get_vars() 30583 1726853680.71910: done getting variables 30583 1726853680.71982: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853680.72137: variable 'lsr_description' from source: include params TASK [Success in test 'I can create a profile'] ******************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 13:34:40 -0400 (0:00:00.131) 0:00:16.059 ****** 30583 1726853680.72192: entering _queue_task() for managed_node2/debug 30583 1726853680.72628: worker is 1 (out of 1 available) 30583 1726853680.72642: exiting _queue_task() for managed_node2/debug 30583 1726853680.72657: done queuing things up, now waiting for results queue to drain 30583 1726853680.72658: waiting for pending results... 
30583 1726853680.73591: running TaskExecutor() for managed_node2/TASK: Success in test 'I can create a profile' 30583 1726853680.73598: in run() - task 02083763-bbaf-05ea-abc5-000000000098 30583 1726853680.73601: variable 'ansible_search_path' from source: unknown 30583 1726853680.73604: variable 'ansible_search_path' from source: unknown 30583 1726853680.73607: calling self._execute() 30583 1726853680.73636: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853680.73643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853680.73661: variable 'omit' from source: magic vars 30583 1726853680.74524: variable 'ansible_distribution_major_version' from source: facts 30583 1726853680.74536: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853680.74543: variable 'omit' from source: magic vars 30583 1726853680.74688: variable 'omit' from source: magic vars 30583 1726853680.74782: variable 'lsr_description' from source: include params 30583 1726853680.74800: variable 'omit' from source: magic vars 30583 1726853680.74839: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853680.75345: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853680.75349: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853680.75351: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853680.75353: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853680.75367: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853680.75373: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853680.75376: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853680.75864: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853680.75874: Set connection var ansible_timeout to 10 30583 1726853680.75878: Set connection var ansible_connection to ssh 30583 1726853680.75884: Set connection var ansible_shell_executable to /bin/sh 30583 1726853680.75887: Set connection var ansible_shell_type to sh 30583 1726853680.75897: Set connection var ansible_pipelining to False 30583 1726853680.76070: variable 'ansible_shell_executable' from source: unknown 30583 1726853680.76076: variable 'ansible_connection' from source: unknown 30583 1726853680.76078: variable 'ansible_module_compression' from source: unknown 30583 1726853680.76080: variable 'ansible_shell_type' from source: unknown 30583 1726853680.76083: variable 'ansible_shell_executable' from source: unknown 30583 1726853680.76084: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853680.76086: variable 'ansible_pipelining' from source: unknown 30583 1726853680.76088: variable 'ansible_timeout' from source: unknown 30583 1726853680.76090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853680.76376: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853680.76388: variable 'omit' from source: magic vars 30583 1726853680.76394: starting attempt loop 30583 1726853680.76396: running the handler 30583 1726853680.76442: handler run complete 30583 1726853680.76455: attempt loop complete, returning result 30583 1726853680.76460: _execute() done 30583 1726853680.76466: dumping result to json 30583 1726853680.76473: done dumping result, returning 30583 
1726853680.76476: done running TaskExecutor() for managed_node2/TASK: Success in test 'I can create a profile' [02083763-bbaf-05ea-abc5-000000000098] 30583 1726853680.76481: sending task result for task 02083763-bbaf-05ea-abc5-000000000098 30583 1726853680.76566: done sending task result for task 02083763-bbaf-05ea-abc5-000000000098 30583 1726853680.76569: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: +++++ Success in test 'I can create a profile' +++++ 30583 1726853680.76622: no more pending results, returning what we have 30583 1726853680.76626: results queue empty 30583 1726853680.76627: checking for any_errors_fatal 30583 1726853680.76635: done checking for any_errors_fatal 30583 1726853680.76636: checking for max_fail_percentage 30583 1726853680.76638: done checking for max_fail_percentage 30583 1726853680.76639: checking to see if all hosts have failed and the running result is not ok 30583 1726853680.76640: done checking to see if all hosts have failed 30583 1726853680.76640: getting the remaining hosts for this loop 30583 1726853680.76643: done getting the remaining hosts for this loop 30583 1726853680.76647: getting the next task for host managed_node2 30583 1726853680.76657: done getting next task for host managed_node2 30583 1726853680.76660: ^ task is: TASK: Cleanup 30583 1726853680.76664: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853680.76670: getting variables 30583 1726853680.76674: in VariableManager get_vars() 30583 1726853680.76707: Calling all_inventory to load vars for managed_node2 30583 1726853680.76710: Calling groups_inventory to load vars for managed_node2 30583 1726853680.76714: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853680.76725: Calling all_plugins_play to load vars for managed_node2 30583 1726853680.76729: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853680.76732: Calling groups_plugins_play to load vars for managed_node2 30583 1726853680.87122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853680.90485: done with get_vars() 30583 1726853680.90518: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 13:34:40 -0400 (0:00:00.184) 0:00:16.243 ****** 30583 1726853680.90604: entering _queue_task() for managed_node2/include_tasks 30583 1726853680.91477: worker is 1 (out of 1 available) 30583 1726853680.91489: exiting _queue_task() for managed_node2/include_tasks 30583 1726853680.91501: done queuing things up, now waiting for results queue to drain 30583 1726853680.91503: waiting for pending results... 
30583 1726853680.92223: running TaskExecutor() for managed_node2/TASK: Cleanup 30583 1726853680.92281: in run() - task 02083763-bbaf-05ea-abc5-00000000009c 30583 1726853680.92465: variable 'ansible_search_path' from source: unknown 30583 1726853680.92469: variable 'ansible_search_path' from source: unknown 30583 1726853680.92473: variable 'lsr_cleanup' from source: include params 30583 1726853680.93049: variable 'lsr_cleanup' from source: include params 30583 1726853680.93054: variable 'omit' from source: magic vars 30583 1726853680.93383: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853680.93616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853680.93620: variable 'omit' from source: magic vars 30583 1726853680.94007: variable 'ansible_distribution_major_version' from source: facts 30583 1726853680.94063: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853680.94178: variable 'item' from source: unknown 30583 1726853680.94240: variable 'item' from source: unknown 30583 1726853680.94577: variable 'item' from source: unknown 30583 1726853680.94581: variable 'item' from source: unknown 30583 1726853680.94930: dumping result to json 30583 1726853680.94933: done dumping result, returning 30583 1726853680.95279: done running TaskExecutor() for managed_node2/TASK: Cleanup [02083763-bbaf-05ea-abc5-00000000009c] 30583 1726853680.95282: sending task result for task 02083763-bbaf-05ea-abc5-00000000009c 30583 1726853680.95331: done sending task result for task 02083763-bbaf-05ea-abc5-00000000009c 30583 1726853680.95335: WORKER PROCESS EXITING 30583 1726853680.95364: no more pending results, returning what we have 30583 1726853680.95369: in VariableManager get_vars() 30583 1726853680.95410: Calling all_inventory to load vars for managed_node2 30583 1726853680.95413: Calling groups_inventory to load vars for managed_node2 30583 1726853680.95417: Calling 
all_plugins_inventory to load vars for managed_node2 30583 1726853680.95430: Calling all_plugins_play to load vars for managed_node2 30583 1726853680.95434: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853680.95438: Calling groups_plugins_play to load vars for managed_node2 30583 1726853680.98231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853681.01873: done with get_vars() 30583 1726853681.01985: variable 'ansible_search_path' from source: unknown 30583 1726853681.01987: variable 'ansible_search_path' from source: unknown 30583 1726853681.02144: we have included files to process 30583 1726853681.02145: generating all_blocks data 30583 1726853681.02147: done generating all_blocks data 30583 1726853681.02151: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30583 1726853681.02152: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30583 1726853681.02158: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30583 1726853681.02700: done processing included file 30583 1726853681.02702: iterating over new_blocks loaded from include file 30583 1726853681.02703: in VariableManager get_vars() 30583 1726853681.02719: done with get_vars() 30583 1726853681.02721: filtering new block on tags 30583 1726853681.02748: done filtering new block on tags 30583 1726853681.02751: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node2 => (item=tasks/cleanup_profile+device.yml) 30583 1726853681.02758: extending task lists for all hosts with included blocks 
30583 1726853681.06020: done extending task lists 30583 1726853681.06176: done processing included files 30583 1726853681.06177: results queue empty 30583 1726853681.06177: checking for any_errors_fatal 30583 1726853681.06186: done checking for any_errors_fatal 30583 1726853681.06187: checking for max_fail_percentage 30583 1726853681.06188: done checking for max_fail_percentage 30583 1726853681.06189: checking to see if all hosts have failed and the running result is not ok 30583 1726853681.06190: done checking to see if all hosts have failed 30583 1726853681.06190: getting the remaining hosts for this loop 30583 1726853681.06192: done getting the remaining hosts for this loop 30583 1726853681.06195: getting the next task for host managed_node2 30583 1726853681.06200: done getting next task for host managed_node2 30583 1726853681.06202: ^ task is: TASK: Cleanup profile and device 30583 1726853681.06205: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853681.06208: getting variables 30583 1726853681.06209: in VariableManager get_vars() 30583 1726853681.06222: Calling all_inventory to load vars for managed_node2 30583 1726853681.06224: Calling groups_inventory to load vars for managed_node2 30583 1726853681.06227: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853681.06234: Calling all_plugins_play to load vars for managed_node2 30583 1726853681.06236: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853681.06239: Calling groups_plugins_play to load vars for managed_node2 30583 1726853681.09693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853681.13124: done with get_vars() 30583 1726853681.13156: done getting variables 30583 1726853681.13319: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 13:34:41 -0400 (0:00:00.227) 0:00:16.470 ****** 30583 1726853681.13351: entering _queue_task() for managed_node2/shell 30583 1726853681.14103: worker is 1 (out of 1 available) 30583 1726853681.14116: exiting _queue_task() for managed_node2/shell 30583 1726853681.14129: done queuing things up, now waiting for results queue to drain 30583 1726853681.14130: waiting for pending results... 
30583 1726853681.14832: running TaskExecutor() for managed_node2/TASK: Cleanup profile and device 30583 1726853681.15078: in run() - task 02083763-bbaf-05ea-abc5-00000000050b 30583 1726853681.15123: variable 'ansible_search_path' from source: unknown 30583 1726853681.15228: variable 'ansible_search_path' from source: unknown 30583 1726853681.15239: calling self._execute() 30583 1726853681.15456: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853681.15521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853681.15537: variable 'omit' from source: magic vars 30583 1726853681.16422: variable 'ansible_distribution_major_version' from source: facts 30583 1726853681.16426: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853681.16428: variable 'omit' from source: magic vars 30583 1726853681.16472: variable 'omit' from source: magic vars 30583 1726853681.16839: variable 'interface' from source: play vars 30583 1726853681.16873: variable 'omit' from source: magic vars 30583 1726853681.16965: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853681.17185: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853681.17188: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853681.17194: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853681.17212: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853681.17275: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853681.17402: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853681.17405: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853681.17534: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853681.17580: Set connection var ansible_timeout to 10 30583 1726853681.17676: Set connection var ansible_connection to ssh 30583 1726853681.17679: Set connection var ansible_shell_executable to /bin/sh 30583 1726853681.17682: Set connection var ansible_shell_type to sh 30583 1726853681.17685: Set connection var ansible_pipelining to False 30583 1726853681.17687: variable 'ansible_shell_executable' from source: unknown 30583 1726853681.17689: variable 'ansible_connection' from source: unknown 30583 1726853681.17692: variable 'ansible_module_compression' from source: unknown 30583 1726853681.17730: variable 'ansible_shell_type' from source: unknown 30583 1726853681.17837: variable 'ansible_shell_executable' from source: unknown 30583 1726853681.17841: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853681.17843: variable 'ansible_pipelining' from source: unknown 30583 1726853681.17846: variable 'ansible_timeout' from source: unknown 30583 1726853681.17848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853681.18119: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853681.18137: variable 'omit' from source: magic vars 30583 1726853681.18175: starting attempt loop 30583 1726853681.18184: running the handler 30583 1726853681.18198: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853681.18276: _low_level_execute_command(): starting 30583 1726853681.18280: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853681.19841: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853681.19887: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853681.19980: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853681.20086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853681.21850: stdout chunk (state=3): >>>/root <<< 30583 1726853681.22024: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853681.22047: stderr chunk (state=3): >>><<< 30583 1726853681.22051: stdout chunk (state=3): >>><<< 30583 1726853681.22196: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853681.22212: _low_level_execute_command(): starting 30583 1726853681.22219: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853681.2219682-31345-162173726841639 `" && echo ansible-tmp-1726853681.2219682-31345-162173726841639="` echo /root/.ansible/tmp/ansible-tmp-1726853681.2219682-31345-162173726841639 `" ) && sleep 0' 30583 1726853681.23439: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853681.23443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
<<< 30583 1726853681.23544: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853681.23548: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853681.23551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853681.23554: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853681.23556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853681.23680: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853681.23797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853681.25865: stdout chunk (state=3): >>>ansible-tmp-1726853681.2219682-31345-162173726841639=/root/.ansible/tmp/ansible-tmp-1726853681.2219682-31345-162173726841639 <<< 30583 1726853681.25977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853681.26086: stderr chunk (state=3): >>><<< 30583 1726853681.26089: stdout chunk (state=3): >>><<< 30583 1726853681.26375: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853681.2219682-31345-162173726841639=/root/.ansible/tmp/ansible-tmp-1726853681.2219682-31345-162173726841639 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853681.26380: variable 'ansible_module_compression' from source: unknown 30583 1726853681.26382: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30583 1726853681.26417: variable 'ansible_facts' from source: unknown 30583 1726853681.26614: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853681.2219682-31345-162173726841639/AnsiballZ_command.py 30583 1726853681.26838: Sending initial data 30583 1726853681.26841: Sent initial data (156 bytes) 30583 1726853681.28393: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853681.28425: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853681.28437: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853681.28491: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853681.28626: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853681.30360: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853681.30542: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853681.30546: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp4okrznsf /root/.ansible/tmp/ansible-tmp-1726853681.2219682-31345-162173726841639/AnsiballZ_command.py <<< 30583 1726853681.30548: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853681.2219682-31345-162173726841639/AnsiballZ_command.py" <<< 30583 1726853681.30707: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp4okrznsf" to remote "/root/.ansible/tmp/ansible-tmp-1726853681.2219682-31345-162173726841639/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853681.2219682-31345-162173726841639/AnsiballZ_command.py" <<< 30583 1726853681.32552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853681.32556: stderr chunk (state=3): >>><<< 30583 1726853681.32561: stdout chunk (state=3): >>><<< 30583 1726853681.32615: done transferring module to remote 30583 1726853681.32625: _low_level_execute_command(): starting 30583 1726853681.32631: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853681.2219682-31345-162173726841639/ /root/.ansible/tmp/ansible-tmp-1726853681.2219682-31345-162173726841639/AnsiballZ_command.py && sleep 0' 30583 1726853681.33729: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853681.33733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853681.33760: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853681.33768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853681.33941: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853681.33945: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853681.34063: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853681.34256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853681.36090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853681.36098: stderr chunk (state=3): >>><<< 30583 1726853681.36102: stdout chunk (state=3): >>><<< 30583 1726853681.36121: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853681.36125: _low_level_execute_command(): starting 30583 1726853681.36209: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853681.2219682-31345-162173726841639/AnsiballZ_command.py && sleep 0' 30583 1726853681.37234: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853681.37239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853681.37413: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853681.37736: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30583 1726853681.37895: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853681.59577: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (cd4fb572-41c5-436a-affc-f73b867bbd77) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 13:34:41.535830", "end": "2024-09-20 13:34:41.594583", "delta": "0:00:00.058753", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30583 1726853681.61324: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853681.61328: stderr chunk (state=3): >>><<< 30583 1726853681.61331: stdout chunk (state=3): >>><<< 30583 1726853681.61352: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "Connection 'statebr' (cd4fb572-41c5-436a-affc-f73b867bbd77) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 13:34:41.535830", "end": "2024-09-20 13:34:41.594583", "delta": "0:00:00.058753", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.197 closed. 30583 1726853681.61398: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853681.2219682-31345-162173726841639/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853681.61776: _low_level_execute_command(): starting 30583 1726853681.61780: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853681.2219682-31345-162173726841639/ > /dev/null 2>&1 && sleep 0' 30583 1726853681.62818: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853681.62894: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853681.63016: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853681.63063: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853681.63119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853681.63187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853681.65178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853681.65182: stdout chunk (state=3): >>><<< 30583 1726853681.65184: stderr chunk (state=3): >>><<< 30583 1726853681.65478: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853681.65482: handler run complete 30583 1726853681.65484: Evaluated conditional (False): False 30583 1726853681.65486: attempt loop complete, returning result 30583 1726853681.65488: _execute() done 30583 1726853681.65490: dumping result to json 30583 1726853681.65492: done dumping result, returning 30583 1726853681.65494: done running TaskExecutor() for managed_node2/TASK: Cleanup profile and device [02083763-bbaf-05ea-abc5-00000000050b] 30583 1726853681.65496: sending task result for task 02083763-bbaf-05ea-abc5-00000000050b 30583 1726853681.65563: done sending task result for task 02083763-bbaf-05ea-abc5-00000000050b 30583 1726853681.65566: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.058753", "end": "2024-09-20 13:34:41.594583", "rc": 1, "start": "2024-09-20 13:34:41.535830" } STDOUT: Connection 'statebr' (cd4fb572-41c5-436a-affc-f73b867bbd77) successfully deleted. 
STDERR: Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' Cannot find device "statebr" MSG: non-zero return code ...ignoring 30583 1726853681.65646: no more pending results, returning what we have 30583 1726853681.65651: results queue empty 30583 1726853681.65652: checking for any_errors_fatal 30583 1726853681.65653: done checking for any_errors_fatal 30583 1726853681.65654: checking for max_fail_percentage 30583 1726853681.65659: done checking for max_fail_percentage 30583 1726853681.65660: checking to see if all hosts have failed and the running result is not ok 30583 1726853681.65660: done checking to see if all hosts have failed 30583 1726853681.65661: getting the remaining hosts for this loop 30583 1726853681.65663: done getting the remaining hosts for this loop 30583 1726853681.65668: getting the next task for host managed_node2 30583 1726853681.65683: done getting next task for host managed_node2 30583 1726853681.65687: ^ task is: TASK: Include the task 'run_test.yml' 30583 1726853681.65689: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853681.65694: getting variables 30583 1726853681.65696: in VariableManager get_vars() 30583 1726853681.65727: Calling all_inventory to load vars for managed_node2 30583 1726853681.65730: Calling groups_inventory to load vars for managed_node2 30583 1726853681.65734: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853681.65744: Calling all_plugins_play to load vars for managed_node2 30583 1726853681.65747: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853681.65750: Calling groups_plugins_play to load vars for managed_node2 30583 1726853681.68866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853681.72048: done with get_vars() 30583 1726853681.72089: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:45 Friday 20 September 2024 13:34:41 -0400 (0:00:00.590) 0:00:17.061 ****** 30583 1726853681.72406: entering _queue_task() for managed_node2/include_tasks 30583 1726853681.73045: worker is 1 (out of 1 available) 30583 1726853681.73063: exiting _queue_task() for managed_node2/include_tasks 30583 1726853681.73082: done queuing things up, now waiting for results queue to drain 30583 1726853681.73084: waiting for pending results... 
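The `FAILED! ... ...ignoring` result above comes from a shell task whose non-zero return code is tolerated: `nmcli con delete` succeeds, but the `nmcli con load` and `ip link del` steps fail because the ifcfg file and device no longer exist. A minimal sketch of such a task, reconstructed from the `_raw_params` and the logged task name (the exact YAML in `tests_states.yml` is an assumption), might look like:

```yaml
# Hedged sketch only -- reconstructed from the logged module args,
# not the verbatim task from the test playbook.
- name: Cleanup profile and device
  ansible.builtin.shell: |
    nmcli con delete statebr
    nmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr
    rm -f /etc/sysconfig/network-scripts/ifcfg-statebr
    ip link del statebr
  ignore_errors: true   # accounts for the "...ignoring" after FAILED! in the log
```

With `ignore_errors: true`, the task result is recorded as failed (rc=1) but the play continues, which matches the log moving straight on to the next task.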
30583 1726853681.73565: running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' 30583 1726853681.73638: in run() - task 02083763-bbaf-05ea-abc5-00000000000f 30583 1726853681.73650: variable 'ansible_search_path' from source: unknown 30583 1726853681.73694: calling self._execute() 30583 1726853681.73969: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853681.73975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853681.73979: variable 'omit' from source: magic vars 30583 1726853681.74375: variable 'ansible_distribution_major_version' from source: facts 30583 1726853681.74412: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853681.74415: _execute() done 30583 1726853681.74418: dumping result to json 30583 1726853681.74421: done dumping result, returning 30583 1726853681.74427: done running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' [02083763-bbaf-05ea-abc5-00000000000f] 30583 1726853681.74430: sending task result for task 02083763-bbaf-05ea-abc5-00000000000f 30583 1726853681.74681: done sending task result for task 02083763-bbaf-05ea-abc5-00000000000f 30583 1726853681.74683: WORKER PROCESS EXITING 30583 1726853681.74741: no more pending results, returning what we have 30583 1726853681.74759: in VariableManager get_vars() 30583 1726853681.74798: Calling all_inventory to load vars for managed_node2 30583 1726853681.74801: Calling groups_inventory to load vars for managed_node2 30583 1726853681.74805: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853681.74819: Calling all_plugins_play to load vars for managed_node2 30583 1726853681.74821: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853681.74824: Calling groups_plugins_play to load vars for managed_node2 30583 1726853681.76531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 30583 1726853681.80097: done with get_vars() 30583 1726853681.80128: variable 'ansible_search_path' from source: unknown 30583 1726853681.80147: we have included files to process 30583 1726853681.80148: generating all_blocks data 30583 1726853681.80151: done generating all_blocks data 30583 1726853681.80161: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30583 1726853681.80162: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30583 1726853681.80166: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30583 1726853681.80675: in VariableManager get_vars() 30583 1726853681.80695: done with get_vars() 30583 1726853681.80739: in VariableManager get_vars() 30583 1726853681.80760: done with get_vars() 30583 1726853681.80847: in VariableManager get_vars() 30583 1726853681.80867: done with get_vars() 30583 1726853681.81058: in VariableManager get_vars() 30583 1726853681.81091: done with get_vars() 30583 1726853681.81144: in VariableManager get_vars() 30583 1726853681.81160: done with get_vars() 30583 1726853681.82241: in VariableManager get_vars() 30583 1726853681.82260: done with get_vars() 30583 1726853681.82274: done processing included file 30583 1726853681.82276: iterating over new_blocks loaded from include file 30583 1726853681.82277: in VariableManager get_vars() 30583 1726853681.82288: done with get_vars() 30583 1726853681.82289: filtering new block on tags 30583 1726853681.82539: done filtering new block on tags 30583 1726853681.82542: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node2 30583 1726853681.82547: extending task lists for all hosts with included 
blocks 30583 1726853681.82586: done extending task lists 30583 1726853681.82587: done processing included files 30583 1726853681.82588: results queue empty 30583 1726853681.82589: checking for any_errors_fatal 30583 1726853681.82593: done checking for any_errors_fatal 30583 1726853681.82594: checking for max_fail_percentage 30583 1726853681.82595: done checking for max_fail_percentage 30583 1726853681.82596: checking to see if all hosts have failed and the running result is not ok 30583 1726853681.82597: done checking to see if all hosts have failed 30583 1726853681.82597: getting the remaining hosts for this loop 30583 1726853681.82599: done getting the remaining hosts for this loop 30583 1726853681.82602: getting the next task for host managed_node2 30583 1726853681.82606: done getting next task for host managed_node2 30583 1726853681.82608: ^ task is: TASK: TEST: {{ lsr_description }} 30583 1726853681.82611: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853681.82614: getting variables 30583 1726853681.82615: in VariableManager get_vars() 30583 1726853681.82637: Calling all_inventory to load vars for managed_node2 30583 1726853681.82639: Calling groups_inventory to load vars for managed_node2 30583 1726853681.82642: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853681.82648: Calling all_plugins_play to load vars for managed_node2 30583 1726853681.82650: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853681.82653: Calling groups_plugins_play to load vars for managed_node2 30583 1726853681.84298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853681.87061: done with get_vars() 30583 1726853681.87147: done getting variables 30583 1726853681.87307: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853681.87704: variable 'lsr_description' from source: include params TASK [TEST: I can create a profile without autoconnect] ************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 13:34:41 -0400 (0:00:00.153) 0:00:17.215 ****** 30583 1726853681.87805: entering _queue_task() for managed_node2/debug 30583 1726853681.88631: worker is 1 (out of 1 available) 30583 1726853681.88645: exiting _queue_task() for managed_node2/debug 30583 1726853681.88657: done queuing things up, now waiting for results queue to drain 30583 1726853681.88658: waiting for pending results... 
30583 1726853681.89527: running TaskExecutor() for managed_node2/TASK: TEST: I can create a profile without autoconnect 30583 1726853681.89533: in run() - task 02083763-bbaf-05ea-abc5-0000000005b4 30583 1726853681.89536: variable 'ansible_search_path' from source: unknown 30583 1726853681.89539: variable 'ansible_search_path' from source: unknown 30583 1726853681.89542: calling self._execute() 30583 1726853681.89608: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853681.89614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853681.89622: variable 'omit' from source: magic vars 30583 1726853681.90427: variable 'ansible_distribution_major_version' from source: facts 30583 1726853681.90554: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853681.90558: variable 'omit' from source: magic vars 30583 1726853681.90595: variable 'omit' from source: magic vars 30583 1726853681.90832: variable 'lsr_description' from source: include params 30583 1726853681.90850: variable 'omit' from source: magic vars 30583 1726853681.91039: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853681.91119: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853681.91257: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853681.91346: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853681.91349: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853681.91425: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853681.91429: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 
1726853681.91431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853681.91544: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853681.91547: Set connection var ansible_timeout to 10 30583 1726853681.91550: Set connection var ansible_connection to ssh 30583 1726853681.91553: Set connection var ansible_shell_executable to /bin/sh 30583 1726853681.91565: Set connection var ansible_shell_type to sh 30583 1726853681.91568: Set connection var ansible_pipelining to False 30583 1726853681.91599: variable 'ansible_shell_executable' from source: unknown 30583 1726853681.91603: variable 'ansible_connection' from source: unknown 30583 1726853681.91606: variable 'ansible_module_compression' from source: unknown 30583 1726853681.91608: variable 'ansible_shell_type' from source: unknown 30583 1726853681.91611: variable 'ansible_shell_executable' from source: unknown 30583 1726853681.91653: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853681.91656: variable 'ansible_pipelining' from source: unknown 30583 1726853681.91659: variable 'ansible_timeout' from source: unknown 30583 1726853681.91662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853681.91865: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853681.91874: variable 'omit' from source: magic vars 30583 1726853681.91877: starting attempt loop 30583 1726853681.91881: running the handler 30583 1726853681.91902: handler run complete 30583 1726853681.91951: attempt loop complete, returning result 30583 1726853681.91954: _execute() done 30583 1726853681.91956: dumping result to json 30583 1726853681.91958: done dumping result, returning 
30583 1726853681.91979: done running TaskExecutor() for managed_node2/TASK: TEST: I can create a profile without autoconnect [02083763-bbaf-05ea-abc5-0000000005b4] 30583 1726853681.92075: sending task result for task 02083763-bbaf-05ea-abc5-0000000005b4 30583 1726853681.92146: done sending task result for task 02083763-bbaf-05ea-abc5-0000000005b4 30583 1726853681.92149: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: ########## I can create a profile without autoconnect ########## 30583 1726853681.92227: no more pending results, returning what we have 30583 1726853681.92251: results queue empty 30583 1726853681.92253: checking for any_errors_fatal 30583 1726853681.92257: done checking for any_errors_fatal 30583 1726853681.92257: checking for max_fail_percentage 30583 1726853681.92260: done checking for max_fail_percentage 30583 1726853681.92260: checking to see if all hosts have failed and the running result is not ok 30583 1726853681.92261: done checking to see if all hosts have failed 30583 1726853681.92262: getting the remaining hosts for this loop 30583 1726853681.92264: done getting the remaining hosts for this loop 30583 1726853681.92268: getting the next task for host managed_node2 30583 1726853681.92278: done getting next task for host managed_node2 30583 1726853681.92282: ^ task is: TASK: Show item 30583 1726853681.92286: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853681.92291: getting variables 30583 1726853681.92292: in VariableManager get_vars() 30583 1726853681.92323: Calling all_inventory to load vars for managed_node2 30583 1726853681.92326: Calling groups_inventory to load vars for managed_node2 30583 1726853681.92330: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853681.92341: Calling all_plugins_play to load vars for managed_node2 30583 1726853681.92344: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853681.92347: Calling groups_plugins_play to load vars for managed_node2 30583 1726853681.94624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853681.97178: done with get_vars() 30583 1726853681.97200: done getting variables 30583 1726853681.97279: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 13:34:41 -0400 (0:00:00.095) 0:00:17.310 ****** 30583 1726853681.97308: entering _queue_task() for managed_node2/debug 30583 1726853681.97776: worker is 1 (out of 1 available) 30583 1726853681.97879: exiting _queue_task() for managed_node2/debug 30583 1726853681.97891: done queuing things up, now waiting for results queue to drain 30583 1726853681.97892: waiting for pending results... 
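The banner task above (`TASK [TEST: I can create a profile without autoconnect]` at `run_test.yml:5`) templates `lsr_description` from the include params into both the task name and the printed message. A hedged sketch consistent with the logged output (the actual YAML is an assumption):

```yaml
# Sketch only -- inferred from the logged task name and MSG output.
- name: "TEST: {{ lsr_description }}"
  ansible.builtin.debug:
    msg: "##########\n{{ lsr_description }}\n##########"
```

This is why the log shows `variable 'lsr_description' from source: include params` being resolved twice: once while templating the task name, once inside the debug handler.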
30583 1726853681.98113: running TaskExecutor() for managed_node2/TASK: Show item 30583 1726853681.98207: in run() - task 02083763-bbaf-05ea-abc5-0000000005b5 30583 1726853681.98216: variable 'ansible_search_path' from source: unknown 30583 1726853681.98220: variable 'ansible_search_path' from source: unknown 30583 1726853681.98312: variable 'omit' from source: magic vars 30583 1726853681.98531: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853681.98534: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853681.98537: variable 'omit' from source: magic vars 30583 1726853681.98895: variable 'ansible_distribution_major_version' from source: facts 30583 1726853681.98961: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853681.98965: variable 'omit' from source: magic vars 30583 1726853681.98967: variable 'omit' from source: magic vars 30583 1726853681.98993: variable 'item' from source: unknown 30583 1726853681.99070: variable 'item' from source: unknown 30583 1726853681.99085: variable 'omit' from source: magic vars 30583 1726853681.99124: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853681.99165: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853681.99186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853681.99204: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853681.99302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853681.99305: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853681.99308: variable 'ansible_host' from source: host vars for 'managed_node2' 
30583 1726853681.99310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853681.99354: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853681.99363: Set connection var ansible_timeout to 10 30583 1726853681.99366: Set connection var ansible_connection to ssh 30583 1726853681.99379: Set connection var ansible_shell_executable to /bin/sh 30583 1726853681.99382: Set connection var ansible_shell_type to sh 30583 1726853681.99392: Set connection var ansible_pipelining to False 30583 1726853681.99415: variable 'ansible_shell_executable' from source: unknown 30583 1726853681.99418: variable 'ansible_connection' from source: unknown 30583 1726853681.99519: variable 'ansible_module_compression' from source: unknown 30583 1726853681.99522: variable 'ansible_shell_type' from source: unknown 30583 1726853681.99525: variable 'ansible_shell_executable' from source: unknown 30583 1726853681.99529: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853681.99531: variable 'ansible_pipelining' from source: unknown 30583 1726853681.99534: variable 'ansible_timeout' from source: unknown 30583 1726853681.99536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853681.99576: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853681.99591: variable 'omit' from source: magic vars 30583 1726853681.99628: starting attempt loop 30583 1726853681.99631: running the handler 30583 1726853681.99642: variable 'lsr_description' from source: include params 30583 1726853681.99715: variable 'lsr_description' from source: include params 30583 1726853681.99725: handler run complete 30583 1726853681.99743: attempt loop 
complete, returning result 30583 1726853681.99761: variable 'item' from source: unknown 30583 1726853681.99825: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can create a profile without autoconnect" } 30583 1726853682.00215: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.00219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.00222: variable 'omit' from source: magic vars 30583 1726853682.00225: variable 'ansible_distribution_major_version' from source: facts 30583 1726853682.00228: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853682.00231: variable 'omit' from source: magic vars 30583 1726853682.00234: variable 'omit' from source: magic vars 30583 1726853682.00238: variable 'item' from source: unknown 30583 1726853682.00241: variable 'item' from source: unknown 30583 1726853682.00243: variable 'omit' from source: magic vars 30583 1726853682.00474: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853682.00477: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853682.00480: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853682.00483: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853682.00485: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.00487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.00490: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853682.00492: Set connection var 
ansible_timeout to 10 30583 1726853682.00494: Set connection var ansible_connection to ssh 30583 1726853682.00497: Set connection var ansible_shell_executable to /bin/sh 30583 1726853682.00499: Set connection var ansible_shell_type to sh 30583 1726853682.00501: Set connection var ansible_pipelining to False 30583 1726853682.00503: variable 'ansible_shell_executable' from source: unknown 30583 1726853682.00505: variable 'ansible_connection' from source: unknown 30583 1726853682.00507: variable 'ansible_module_compression' from source: unknown 30583 1726853682.00509: variable 'ansible_shell_type' from source: unknown 30583 1726853682.00511: variable 'ansible_shell_executable' from source: unknown 30583 1726853682.00513: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.00515: variable 'ansible_pipelining' from source: unknown 30583 1726853682.00517: variable 'ansible_timeout' from source: unknown 30583 1726853682.00519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.00521: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853682.00523: variable 'omit' from source: magic vars 30583 1726853682.00525: starting attempt loop 30583 1726853682.00527: running the handler 30583 1726853682.00545: variable 'lsr_setup' from source: include params 30583 1726853682.00617: variable 'lsr_setup' from source: include params 30583 1726853682.00662: handler run complete 30583 1726853682.00676: attempt loop complete, returning result 30583 1726853682.00694: variable 'item' from source: unknown 30583 1726853682.00751: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", 
"lsr_setup": [ "tasks/delete_interface.yml", "tasks/assert_device_absent.yml" ] } 30583 1726853682.00861: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.00864: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.00867: variable 'omit' from source: magic vars 30583 1726853682.01122: variable 'ansible_distribution_major_version' from source: facts 30583 1726853682.01126: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853682.01128: variable 'omit' from source: magic vars 30583 1726853682.01131: variable 'omit' from source: magic vars 30583 1726853682.01133: variable 'item' from source: unknown 30583 1726853682.01135: variable 'item' from source: unknown 30583 1726853682.01137: variable 'omit' from source: magic vars 30583 1726853682.01150: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853682.01159: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853682.01166: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853682.01178: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853682.01181: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.01183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.01254: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853682.01262: Set connection var ansible_timeout to 10 30583 1726853682.01265: Set connection var ansible_connection to ssh 30583 1726853682.01270: Set connection var ansible_shell_executable to /bin/sh 30583 1726853682.01274: Set connection var ansible_shell_type to sh 30583 
1726853682.01340: Set connection var ansible_pipelining to False 30583 1726853682.01343: variable 'ansible_shell_executable' from source: unknown 30583 1726853682.01345: variable 'ansible_connection' from source: unknown 30583 1726853682.01347: variable 'ansible_module_compression' from source: unknown 30583 1726853682.01349: variable 'ansible_shell_type' from source: unknown 30583 1726853682.01351: variable 'ansible_shell_executable' from source: unknown 30583 1726853682.01353: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.01355: variable 'ansible_pipelining' from source: unknown 30583 1726853682.01357: variable 'ansible_timeout' from source: unknown 30583 1726853682.01359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.01409: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853682.01415: variable 'omit' from source: magic vars 30583 1726853682.01418: starting attempt loop 30583 1726853682.01421: running the handler 30583 1726853682.01447: variable 'lsr_test' from source: include params 30583 1726853682.01508: variable 'lsr_test' from source: include params 30583 1726853682.01556: handler run complete 30583 1726853682.01559: attempt loop complete, returning result 30583 1726853682.01562: variable 'item' from source: unknown 30583 1726853682.01617: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/create_bridge_profile_no_autoconnect.yml" ] } 30583 1726853682.01815: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.01819: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30583 1726853682.01822: variable 'omit' from source: magic vars 30583 1726853682.01856: variable 'ansible_distribution_major_version' from source: facts 30583 1726853682.01864: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853682.01868: variable 'omit' from source: magic vars 30583 1726853682.01883: variable 'omit' from source: magic vars 30583 1726853682.01919: variable 'item' from source: unknown 30583 1726853682.01989: variable 'item' from source: unknown 30583 1726853682.01997: variable 'omit' from source: magic vars 30583 1726853682.02075: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853682.02078: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853682.02081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853682.02083: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853682.02085: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.02087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.02110: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853682.02116: Set connection var ansible_timeout to 10 30583 1726853682.02118: Set connection var ansible_connection to ssh 30583 1726853682.02123: Set connection var ansible_shell_executable to /bin/sh 30583 1726853682.02126: Set connection var ansible_shell_type to sh 30583 1726853682.02135: Set connection var ansible_pipelining to False 30583 1726853682.02161: variable 'ansible_shell_executable' from source: unknown 30583 1726853682.02164: variable 'ansible_connection' from source: unknown 30583 1726853682.02166: variable 
'ansible_module_compression' from source: unknown 30583 1726853682.02168: variable 'ansible_shell_type' from source: unknown 30583 1726853682.02172: variable 'ansible_shell_executable' from source: unknown 30583 1726853682.02175: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.02179: variable 'ansible_pipelining' from source: unknown 30583 1726853682.02181: variable 'ansible_timeout' from source: unknown 30583 1726853682.02204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.02268: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853682.02313: variable 'omit' from source: magic vars 30583 1726853682.02321: starting attempt loop 30583 1726853682.02324: running the handler 30583 1726853682.02326: variable 'lsr_assert' from source: include params 30583 1726853682.02354: variable 'lsr_assert' from source: include params 30583 1726853682.02378: handler run complete 30583 1726853682.02392: attempt loop complete, returning result 30583 1726853682.02405: variable 'item' from source: unknown 30583 1726853682.02534: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_device_absent.yml", "tasks/assert_profile_present.yml" ] } 30583 1726853682.02600: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.02603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.02607: variable 'omit' from source: magic vars 30583 1726853682.02753: variable 'ansible_distribution_major_version' from source: facts 30583 1726853682.02778: Evaluated conditional 
(ansible_distribution_major_version != '6'): True 30583 1726853682.02786: variable 'omit' from source: magic vars 30583 1726853682.02789: variable 'omit' from source: magic vars 30583 1726853682.02987: variable 'item' from source: unknown 30583 1726853682.02990: variable 'item' from source: unknown 30583 1726853682.02992: variable 'omit' from source: magic vars 30583 1726853682.02994: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853682.02996: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853682.02998: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853682.02999: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853682.03001: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.03003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.03293: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853682.03297: Set connection var ansible_timeout to 10 30583 1726853682.03299: Set connection var ansible_connection to ssh 30583 1726853682.03301: Set connection var ansible_shell_executable to /bin/sh 30583 1726853682.03303: Set connection var ansible_shell_type to sh 30583 1726853682.03305: Set connection var ansible_pipelining to False 30583 1726853682.03361: variable 'ansible_shell_executable' from source: unknown 30583 1726853682.03364: variable 'ansible_connection' from source: unknown 30583 1726853682.03367: variable 'ansible_module_compression' from source: unknown 30583 1726853682.03369: variable 'ansible_shell_type' from source: unknown 30583 1726853682.03373: variable 'ansible_shell_executable' from source: unknown 30583 
1726853682.03376: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.03377: variable 'ansible_pipelining' from source: unknown 30583 1726853682.03379: variable 'ansible_timeout' from source: unknown 30583 1726853682.03381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.03539: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853682.03546: variable 'omit' from source: magic vars 30583 1726853682.03548: starting attempt loop 30583 1726853682.03551: running the handler 30583 1726853682.03835: handler run complete 30583 1726853682.03846: attempt loop complete, returning result 30583 1726853682.03862: variable 'item' from source: unknown 30583 1726853682.03923: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 30583 1726853682.04111: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.04116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.04264: variable 'omit' from source: magic vars 30583 1726853682.04493: variable 'ansible_distribution_major_version' from source: facts 30583 1726853682.04501: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853682.04504: variable 'omit' from source: magic vars 30583 1726853682.04514: variable 'omit' from source: magic vars 30583 1726853682.04551: variable 'item' from source: unknown 30583 1726853682.04733: variable 'item' from source: unknown 30583 1726853682.04747: variable 'omit' from source: magic vars 30583 
1726853682.04805: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853682.04808: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853682.04811: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853682.04813: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853682.04815: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.04817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.04944: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853682.04949: Set connection var ansible_timeout to 10 30583 1726853682.04951: Set connection var ansible_connection to ssh 30583 1726853682.04956: Set connection var ansible_shell_executable to /bin/sh 30583 1726853682.04975: Set connection var ansible_shell_type to sh 30583 1726853682.04977: Set connection var ansible_pipelining to False 30583 1726853682.05151: variable 'ansible_shell_executable' from source: unknown 30583 1726853682.05155: variable 'ansible_connection' from source: unknown 30583 1726853682.05157: variable 'ansible_module_compression' from source: unknown 30583 1726853682.05159: variable 'ansible_shell_type' from source: unknown 30583 1726853682.05162: variable 'ansible_shell_executable' from source: unknown 30583 1726853682.05164: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.05166: variable 'ansible_pipelining' from source: unknown 30583 1726853682.05168: variable 'ansible_timeout' from source: unknown 30583 1726853682.05239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.05299: Loading ActionModule 'debug' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853682.05306: variable 'omit' from source: magic vars 30583 1726853682.05309: starting attempt loop 30583 1726853682.05311: running the handler 30583 1726853682.05329: variable 'lsr_fail_debug' from source: play vars 30583 1726853682.05575: variable 'lsr_fail_debug' from source: play vars 30583 1726853682.05580: handler run complete 30583 1726853682.05583: attempt loop complete, returning result 30583 1726853682.05585: variable 'item' from source: unknown 30583 1726853682.05597: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 30583 1726853682.05781: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.05790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.05799: variable 'omit' from source: magic vars 30583 1726853682.06176: variable 'ansible_distribution_major_version' from source: facts 30583 1726853682.06179: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853682.06182: variable 'omit' from source: magic vars 30583 1726853682.06187: variable 'omit' from source: magic vars 30583 1726853682.06189: variable 'item' from source: unknown 30583 1726853682.06191: variable 'item' from source: unknown 30583 1726853682.06193: variable 'omit' from source: magic vars 30583 1726853682.06196: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853682.06198: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853682.06200: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853682.06207: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853682.06217: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.06219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.06326: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853682.06330: Set connection var ansible_timeout to 10 30583 1726853682.06332: Set connection var ansible_connection to ssh 30583 1726853682.06334: Set connection var ansible_shell_executable to /bin/sh 30583 1726853682.06337: Set connection var ansible_shell_type to sh 30583 1726853682.06339: Set connection var ansible_pipelining to False 30583 1726853682.06341: variable 'ansible_shell_executable' from source: unknown 30583 1726853682.06343: variable 'ansible_connection' from source: unknown 30583 1726853682.06345: variable 'ansible_module_compression' from source: unknown 30583 1726853682.06347: variable 'ansible_shell_type' from source: unknown 30583 1726853682.06350: variable 'ansible_shell_executable' from source: unknown 30583 1726853682.06352: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.06354: variable 'ansible_pipelining' from source: unknown 30583 1726853682.06355: variable 'ansible_timeout' from source: unknown 30583 1726853682.06357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.06477: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853682.06480: variable 'omit' from source: magic vars 30583 1726853682.06483: starting attempt loop 30583 1726853682.06486: running the handler 30583 1726853682.06488: variable 'lsr_cleanup' from source: include params 30583 1726853682.06674: variable 'lsr_cleanup' from source: include params 30583 1726853682.06679: handler run complete 30583 1726853682.06682: attempt loop complete, returning result 30583 1726853682.06685: variable 'item' from source: unknown 30583 1726853682.06688: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 30583 1726853682.06748: dumping result to json 30583 1726853682.06761: done dumping result, returning 30583 1726853682.06764: done running TaskExecutor() for managed_node2/TASK: Show item [02083763-bbaf-05ea-abc5-0000000005b5] 30583 1726853682.06767: sending task result for task 02083763-bbaf-05ea-abc5-0000000005b5 30583 1726853682.06907: done sending task result for task 02083763-bbaf-05ea-abc5-0000000005b5 30583 1726853682.06964: no more pending results, returning what we have 30583 1726853682.06968: results queue empty 30583 1726853682.06969: checking for any_errors_fatal 30583 1726853682.06976: done checking for any_errors_fatal 30583 1726853682.06977: checking for max_fail_percentage 30583 1726853682.06979: done checking for max_fail_percentage 30583 1726853682.06980: checking to see if all hosts have failed and the running result is not ok 30583 1726853682.06981: done checking to see if all hosts have failed 30583 1726853682.06981: getting the remaining hosts for this loop 30583 1726853682.06983: done getting the remaining hosts for this loop 30583 1726853682.06987: getting the next task for host 
managed_node2 30583 1726853682.06994: done getting next task for host managed_node2 30583 1726853682.06997: ^ task is: TASK: Include the task 'show_interfaces.yml' 30583 1726853682.07001: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853682.07005: getting variables 30583 1726853682.07007: in VariableManager get_vars() 30583 1726853682.07039: Calling all_inventory to load vars for managed_node2 30583 1726853682.07042: Calling groups_inventory to load vars for managed_node2 30583 1726853682.07045: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853682.07061: Calling all_plugins_play to load vars for managed_node2 30583 1726853682.07065: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853682.07070: Calling groups_plugins_play to load vars for managed_node2 30583 1726853682.07813: WORKER PROCESS EXITING 30583 1726853682.09290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853682.12053: done with get_vars() 30583 1726853682.12089: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 13:34:42 -0400 (0:00:00.148) 0:00:17.459 ****** 30583 1726853682.12192: entering _queue_task() for managed_node2/include_tasks 30583 
1726853682.12539: worker is 1 (out of 1 available) 30583 1726853682.12550: exiting _queue_task() for managed_node2/include_tasks 30583 1726853682.12564: done queuing things up, now waiting for results queue to drain 30583 1726853682.12565: waiting for pending results... 30583 1726853682.12858: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 30583 1726853682.13059: in run() - task 02083763-bbaf-05ea-abc5-0000000005b6 30583 1726853682.13063: variable 'ansible_search_path' from source: unknown 30583 1726853682.13066: variable 'ansible_search_path' from source: unknown 30583 1726853682.13089: calling self._execute() 30583 1726853682.13209: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.13213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.13224: variable 'omit' from source: magic vars 30583 1726853682.13611: variable 'ansible_distribution_major_version' from source: facts 30583 1726853682.13628: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853682.13634: _execute() done 30583 1726853682.13636: dumping result to json 30583 1726853682.13639: done dumping result, returning 30583 1726853682.13685: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [02083763-bbaf-05ea-abc5-0000000005b6] 30583 1726853682.13688: sending task result for task 02083763-bbaf-05ea-abc5-0000000005b6 30583 1726853682.13760: done sending task result for task 02083763-bbaf-05ea-abc5-0000000005b6 30583 1726853682.13764: WORKER PROCESS EXITING 30583 1726853682.13797: no more pending results, returning what we have 30583 1726853682.13802: in VariableManager get_vars() 30583 1726853682.13841: Calling all_inventory to load vars for managed_node2 30583 1726853682.13844: Calling groups_inventory to load vars for managed_node2 30583 1726853682.13848: Calling all_plugins_inventory to load vars for managed_node2 
30583 1726853682.13866: Calling all_plugins_play to load vars for managed_node2 30583 1726853682.13869: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853682.13874: Calling groups_plugins_play to load vars for managed_node2 30583 1726853682.15563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853682.17129: done with get_vars() 30583 1726853682.17151: variable 'ansible_search_path' from source: unknown 30583 1726853682.17153: variable 'ansible_search_path' from source: unknown 30583 1726853682.17198: we have included files to process 30583 1726853682.17199: generating all_blocks data 30583 1726853682.17201: done generating all_blocks data 30583 1726853682.17205: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30583 1726853682.17206: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30583 1726853682.17208: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30583 1726853682.17316: in VariableManager get_vars() 30583 1726853682.17333: done with get_vars() 30583 1726853682.17444: done processing included file 30583 1726853682.17446: iterating over new_blocks loaded from include file 30583 1726853682.17447: in VariableManager get_vars() 30583 1726853682.17463: done with get_vars() 30583 1726853682.17465: filtering new block on tags 30583 1726853682.17503: done filtering new block on tags 30583 1726853682.17506: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 30583 1726853682.17511: extending task lists for all hosts with included blocks 30583 1726853682.17956: 
done extending task lists 30583 1726853682.17958: done processing included files 30583 1726853682.17959: results queue empty 30583 1726853682.17959: checking for any_errors_fatal 30583 1726853682.17964: done checking for any_errors_fatal 30583 1726853682.17965: checking for max_fail_percentage 30583 1726853682.17966: done checking for max_fail_percentage 30583 1726853682.17967: checking to see if all hosts have failed and the running result is not ok 30583 1726853682.17968: done checking to see if all hosts have failed 30583 1726853682.17968: getting the remaining hosts for this loop 30583 1726853682.17970: done getting the remaining hosts for this loop 30583 1726853682.17974: getting the next task for host managed_node2 30583 1726853682.17978: done getting next task for host managed_node2 30583 1726853682.17980: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30583 1726853682.17984: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853682.17986: getting variables 30583 1726853682.17987: in VariableManager get_vars() 30583 1726853682.17996: Calling all_inventory to load vars for managed_node2 30583 1726853682.17998: Calling groups_inventory to load vars for managed_node2 30583 1726853682.18000: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853682.18006: Calling all_plugins_play to load vars for managed_node2 30583 1726853682.18009: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853682.18011: Calling groups_plugins_play to load vars for managed_node2 30583 1726853682.19167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853682.20715: done with get_vars() 30583 1726853682.20738: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 13:34:42 -0400 (0:00:00.086) 0:00:17.545 ****** 30583 1726853682.20818: entering _queue_task() for managed_node2/include_tasks 30583 1726853682.21403: worker is 1 (out of 1 available) 30583 1726853682.21411: exiting _queue_task() for managed_node2/include_tasks 30583 1726853682.21422: done queuing things up, now waiting for results queue to drain 30583 1726853682.21423: waiting for pending results... 
30583 1726853682.21697: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 30583 1726853682.21702: in run() - task 02083763-bbaf-05ea-abc5-0000000005dd 30583 1726853682.21705: variable 'ansible_search_path' from source: unknown 30583 1726853682.21707: variable 'ansible_search_path' from source: unknown 30583 1726853682.21710: calling self._execute() 30583 1726853682.21741: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.21746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.21763: variable 'omit' from source: magic vars 30583 1726853682.22126: variable 'ansible_distribution_major_version' from source: facts 30583 1726853682.22137: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853682.22143: _execute() done 30583 1726853682.22146: dumping result to json 30583 1726853682.22151: done dumping result, returning 30583 1726853682.22162: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [02083763-bbaf-05ea-abc5-0000000005dd] 30583 1726853682.22166: sending task result for task 02083763-bbaf-05ea-abc5-0000000005dd 30583 1726853682.22273: done sending task result for task 02083763-bbaf-05ea-abc5-0000000005dd 30583 1726853682.22276: WORKER PROCESS EXITING 30583 1726853682.22318: no more pending results, returning what we have 30583 1726853682.22323: in VariableManager get_vars() 30583 1726853682.22361: Calling all_inventory to load vars for managed_node2 30583 1726853682.22364: Calling groups_inventory to load vars for managed_node2 30583 1726853682.22368: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853682.22384: Calling all_plugins_play to load vars for managed_node2 30583 1726853682.22388: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853682.22391: Calling groups_plugins_play to load vars for managed_node2 30583 
1726853682.23947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853682.25467: done with get_vars() 30583 1726853682.25491: variable 'ansible_search_path' from source: unknown 30583 1726853682.25492: variable 'ansible_search_path' from source: unknown 30583 1726853682.25526: we have included files to process 30583 1726853682.25527: generating all_blocks data 30583 1726853682.25528: done generating all_blocks data 30583 1726853682.25529: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30583 1726853682.25530: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30583 1726853682.25533: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30583 1726853682.25836: done processing included file 30583 1726853682.25838: iterating over new_blocks loaded from include file 30583 1726853682.25840: in VariableManager get_vars() 30583 1726853682.25854: done with get_vars() 30583 1726853682.25856: filtering new block on tags 30583 1726853682.25891: done filtering new block on tags 30583 1726853682.25894: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 30583 1726853682.25899: extending task lists for all hosts with included blocks 30583 1726853682.26047: done extending task lists 30583 1726853682.26049: done processing included files 30583 1726853682.26049: results queue empty 30583 1726853682.26050: checking for any_errors_fatal 30583 1726853682.26053: done checking for any_errors_fatal 30583 1726853682.26054: checking for max_fail_percentage 30583 1726853682.26059: done 
checking for max_fail_percentage 30583 1726853682.26060: checking to see if all hosts have failed and the running result is not ok 30583 1726853682.26061: done checking to see if all hosts have failed 30583 1726853682.26062: getting the remaining hosts for this loop 30583 1726853682.26063: done getting the remaining hosts for this loop 30583 1726853682.26065: getting the next task for host managed_node2 30583 1726853682.26070: done getting next task for host managed_node2 30583 1726853682.26073: ^ task is: TASK: Gather current interface info 30583 1726853682.26077: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853682.26079: getting variables 30583 1726853682.26079: in VariableManager get_vars() 30583 1726853682.26087: Calling all_inventory to load vars for managed_node2 30583 1726853682.26089: Calling groups_inventory to load vars for managed_node2 30583 1726853682.26091: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853682.26096: Calling all_plugins_play to load vars for managed_node2 30583 1726853682.26098: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853682.26100: Calling groups_plugins_play to load vars for managed_node2 30583 1726853682.27249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853682.28874: done with get_vars() 30583 1726853682.28901: done getting variables 30583 1726853682.28945: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 13:34:42 -0400 (0:00:00.081) 0:00:17.627 ****** 30583 1726853682.28982: entering _queue_task() for managed_node2/command 30583 1726853682.29332: worker is 1 (out of 1 available) 30583 1726853682.29345: exiting _queue_task() for managed_node2/command 30583 1726853682.29362: done queuing things up, now waiting for results queue to drain 30583 1726853682.29364: waiting for pending results... 
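The task queued above (task path `get_current_interfaces.yml:3`) runs `ls -1` with `chdir: /sys/class/net`, as the module result later in this log confirms. A minimal Python sketch of the same enumeration, assuming a Linux host with sysfs mounted (the helper name is ours, not part of the role):

```python
import os

def list_net_interfaces(sysfs_path="/sys/class/net"):
    """Enumerate interface names the way `ls -1 /sys/class/net` does."""
    try:
        return sorted(os.listdir(sysfs_path))
    except FileNotFoundError:
        # Non-Linux hosts or minimal containers may not expose sysfs
        return []
```

On the managed node in this run, the equivalent command returned `bonding_masters`, `eth0`, and `lo`.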
30583 1726853682.29666: running TaskExecutor() for managed_node2/TASK: Gather current interface info 30583 1726853682.29878: in run() - task 02083763-bbaf-05ea-abc5-000000000618 30583 1726853682.29882: variable 'ansible_search_path' from source: unknown 30583 1726853682.29885: variable 'ansible_search_path' from source: unknown 30583 1726853682.29887: calling self._execute() 30583 1726853682.29921: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.29934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.29948: variable 'omit' from source: magic vars 30583 1726853682.30350: variable 'ansible_distribution_major_version' from source: facts 30583 1726853682.30368: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853682.30384: variable 'omit' from source: magic vars 30583 1726853682.30444: variable 'omit' from source: magic vars 30583 1726853682.30487: variable 'omit' from source: magic vars 30583 1726853682.30533: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853682.30660: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853682.30664: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853682.30667: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853682.30670: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853682.30695: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853682.30704: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.30713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 
1726853682.30828: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853682.30879: Set connection var ansible_timeout to 10 30583 1726853682.30882: Set connection var ansible_connection to ssh 30583 1726853682.30884: Set connection var ansible_shell_executable to /bin/sh 30583 1726853682.30886: Set connection var ansible_shell_type to sh 30583 1726853682.30888: Set connection var ansible_pipelining to False 30583 1726853682.30911: variable 'ansible_shell_executable' from source: unknown 30583 1726853682.30920: variable 'ansible_connection' from source: unknown 30583 1726853682.30928: variable 'ansible_module_compression' from source: unknown 30583 1726853682.30936: variable 'ansible_shell_type' from source: unknown 30583 1726853682.30976: variable 'ansible_shell_executable' from source: unknown 30583 1726853682.30979: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.30986: variable 'ansible_pipelining' from source: unknown 30583 1726853682.30988: variable 'ansible_timeout' from source: unknown 30583 1726853682.30990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.31123: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853682.31136: variable 'omit' from source: magic vars 30583 1726853682.31145: starting attempt loop 30583 1726853682.31150: running the handler 30583 1726853682.31175: _low_level_execute_command(): starting 30583 1726853682.31202: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853682.31986: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853682.32025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853682.32045: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853682.32060: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853682.32186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853682.33978: stdout chunk (state=3): >>>/root <<< 30583 1726853682.34051: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853682.34087: stderr chunk (state=3): >>><<< 30583 1726853682.34091: stdout chunk (state=3): >>><<< 30583 1726853682.34111: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853682.34123: _low_level_execute_command(): starting 30583 1726853682.34130: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853682.341119-31384-99094357785579 `" && echo ansible-tmp-1726853682.341119-31384-99094357785579="` echo /root/.ansible/tmp/ansible-tmp-1726853682.341119-31384-99094357785579 `" ) && sleep 0' 30583 1726853682.34548: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853682.34590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853682.34593: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853682.34595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853682.34599: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853682.34608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853682.34611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853682.34653: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853682.34657: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853682.34661: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853682.34736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853682.36754: stdout chunk (state=3): >>>ansible-tmp-1726853682.341119-31384-99094357785579=/root/.ansible/tmp/ansible-tmp-1726853682.341119-31384-99094357785579 <<< 30583 1726853682.36860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853682.36893: stderr chunk (state=3): >>><<< 30583 1726853682.36896: stdout chunk (state=3): >>><<< 30583 1726853682.36909: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853682.341119-31384-99094357785579=/root/.ansible/tmp/ansible-tmp-1726853682.341119-31384-99094357785579 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853682.36936: variable 'ansible_module_compression' from source: unknown 30583 1726853682.36984: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30583 1726853682.37025: variable 'ansible_facts' from source: unknown 30583 1726853682.37129: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853682.341119-31384-99094357785579/AnsiballZ_command.py 30583 1726853682.37282: Sending initial data 30583 1726853682.37296: Sent initial data (154 bytes) 30583 1726853682.37712: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853682.37715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853682.37718: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853682.37720: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853682.37723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853682.37811: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853682.37822: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853682.37916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853682.39587: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853682.39652: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853682.39741: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp6hzde7hz /root/.ansible/tmp/ansible-tmp-1726853682.341119-31384-99094357785579/AnsiballZ_command.py <<< 30583 1726853682.39745: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853682.341119-31384-99094357785579/AnsiballZ_command.py" <<< 30583 1726853682.39810: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp6hzde7hz" to remote "/root/.ansible/tmp/ansible-tmp-1726853682.341119-31384-99094357785579/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853682.341119-31384-99094357785579/AnsiballZ_command.py" <<< 30583 1726853682.40733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853682.40736: stdout chunk (state=3): >>><<< 30583 1726853682.40738: stderr chunk (state=3): >>><<< 30583 1726853682.40754: done transferring module to remote 30583 1726853682.40842: _low_level_execute_command(): starting 30583 1726853682.40847: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853682.341119-31384-99094357785579/ /root/.ansible/tmp/ansible-tmp-1726853682.341119-31384-99094357785579/AnsiballZ_command.py && sleep 0' 30583 1726853682.41919: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853682.41988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853682.42000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853682.42015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853682.42255: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853682.42264: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853682.42267: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853682.42291: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853682.42418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853682.44324: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853682.44600: stderr chunk (state=3): >>><<< 30583 1726853682.44607: stdout chunk (state=3): >>><<< 30583 1726853682.44610: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853682.44613: _low_level_execute_command(): starting 30583 1726853682.44615: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853682.341119-31384-99094357785579/AnsiballZ_command.py && sleep 0' 30583 1726853682.45796: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853682.45994: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853682.46039: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853682.46043: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30583 1726853682.46122: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853682.62095: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:34:42.616374", "end": "2024-09-20 13:34:42.619937", "delta": "0:00:00.003563", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30583 1726853682.63859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853682.63863: stdout chunk (state=3): >>><<< 30583 1726853682.63866: stderr chunk (state=3): >>><<< 30583 1726853682.63868: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:34:42.616374", "end": "2024-09-20 13:34:42.619937", "delta": "0:00:00.003563", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853682.63877: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853682.341119-31384-99094357785579/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853682.63980: _low_level_execute_command(): starting 30583 1726853682.63984: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853682.341119-31384-99094357785579/ > /dev/null 2>&1 && sleep 0' 30583 1726853682.64600: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853682.64679: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853682.64695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853682.64836: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853682.64842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853682.64845: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853682.65154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853682.66836: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853682.66840: stdout chunk (state=3): >>><<< 30583 1726853682.66842: stderr chunk (state=3): >>><<< 30583 1726853682.66861: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853682.66875: handler run complete 30583 1726853682.67077: Evaluated conditional (False): False 30583 1726853682.67080: attempt loop complete, returning result 30583 1726853682.67083: _execute() done 30583 1726853682.67085: dumping result to json 30583 1726853682.67087: done dumping result, returning 30583 1726853682.67088: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [02083763-bbaf-05ea-abc5-000000000618] 30583 1726853682.67090: sending task result for task 02083763-bbaf-05ea-abc5-000000000618 30583 1726853682.67166: done sending task result for task 02083763-bbaf-05ea-abc5-000000000618 30583 1726853682.67173: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003563", "end": "2024-09-20 13:34:42.619937", "rc": 0, "start": "2024-09-20 13:34:42.616374" } STDOUT: bonding_masters eth0 lo 30583 1726853682.67258: no more pending results, returning what we have 30583 1726853682.67262: results queue empty 30583 1726853682.67264: checking for any_errors_fatal 30583 1726853682.67265: done checking for any_errors_fatal 30583 1726853682.67266: checking for max_fail_percentage 30583 
1726853682.67268: done checking for max_fail_percentage 30583 1726853682.67269: checking to see if all hosts have failed and the running result is not ok 30583 1726853682.67270: done checking to see if all hosts have failed 30583 1726853682.67272: getting the remaining hosts for this loop 30583 1726853682.67274: done getting the remaining hosts for this loop 30583 1726853682.67278: getting the next task for host managed_node2 30583 1726853682.67294: done getting next task for host managed_node2 30583 1726853682.67297: ^ task is: TASK: Set current_interfaces 30583 1726853682.67303: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853682.67310: getting variables 30583 1726853682.67312: in VariableManager get_vars() 30583 1726853682.67347: Calling all_inventory to load vars for managed_node2 30583 1726853682.67350: Calling groups_inventory to load vars for managed_node2 30583 1726853682.67355: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853682.67367: Calling all_plugins_play to load vars for managed_node2 30583 1726853682.67377: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853682.67382: Calling groups_plugins_play to load vars for managed_node2 30583 1726853682.68547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853682.70527: done with get_vars() 30583 1726853682.70554: done getting variables 30583 1726853682.70621: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:34:42 -0400 (0:00:00.416) 0:00:18.043 ****** 30583 1726853682.70658: entering _queue_task() for managed_node2/set_fact 30583 1726853682.71034: worker is 1 (out of 1 available) 30583 1726853682.71047: exiting _queue_task() for managed_node2/set_fact 30583 1726853682.71060: done queuing things up, now waiting for results queue to drain 30583 1726853682.71061: waiting for pending results... 
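The `Set current_interfaces` task queued here (task path `get_current_interfaces.yml:9`) consumes the registered `_current_interfaces` result from the command task above. Judging from the STDOUT recorded in this log, the fact it sets amounts to splitting that stdout into a list of interface names; a hedged sketch of that computation (the exact Jinja expression in the role is not shown in this log):

```python
# stdout returned by the "Gather current interface info" task on managed_node2
stdout = "bonding_masters\neth0\nlo"

# Ansible exposes this as stdout_lines; the equivalent split is:
current_interfaces = stdout.split("\n")
print(current_interfaces)  # ['bonding_masters', 'eth0', 'lo']
```

This matches the three entries shown under STDOUT in the task result above.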
30583 1726853682.71595: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 30583 1726853682.71601: in run() - task 02083763-bbaf-05ea-abc5-000000000619 30583 1726853682.71605: variable 'ansible_search_path' from source: unknown 30583 1726853682.71607: variable 'ansible_search_path' from source: unknown 30583 1726853682.71611: calling self._execute() 30583 1726853682.71613: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.71615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.71619: variable 'omit' from source: magic vars 30583 1726853682.72085: variable 'ansible_distribution_major_version' from source: facts 30583 1726853682.72105: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853682.72124: variable 'omit' from source: magic vars 30583 1726853682.72181: variable 'omit' from source: magic vars 30583 1726853682.72294: variable '_current_interfaces' from source: set_fact 30583 1726853682.72374: variable 'omit' from source: magic vars 30583 1726853682.72419: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853682.72480: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853682.72506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853682.72527: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853682.72542: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853682.72588: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853682.72672: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.72675: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.72728: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853682.72738: Set connection var ansible_timeout to 10 30583 1726853682.72746: Set connection var ansible_connection to ssh 30583 1726853682.72763: Set connection var ansible_shell_executable to /bin/sh 30583 1726853682.72780: Set connection var ansible_shell_type to sh 30583 1726853682.72799: Set connection var ansible_pipelining to False 30583 1726853682.72827: variable 'ansible_shell_executable' from source: unknown 30583 1726853682.72837: variable 'ansible_connection' from source: unknown 30583 1726853682.72843: variable 'ansible_module_compression' from source: unknown 30583 1726853682.72849: variable 'ansible_shell_type' from source: unknown 30583 1726853682.72855: variable 'ansible_shell_executable' from source: unknown 30583 1726853682.72861: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.72868: variable 'ansible_pipelining' from source: unknown 30583 1726853682.72891: variable 'ansible_timeout' from source: unknown 30583 1726853682.72975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.73057: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853682.73077: variable 'omit' from source: magic vars 30583 1726853682.73088: starting attempt loop 30583 1726853682.73104: running the handler 30583 1726853682.73124: handler run complete 30583 1726853682.73140: attempt loop complete, returning result 30583 1726853682.73147: _execute() done 30583 1726853682.73153: dumping result to json 30583 1726853682.73161: done dumping result, returning 30583 
1726853682.73176: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [02083763-bbaf-05ea-abc5-000000000619] 30583 1726853682.73187: sending task result for task 02083763-bbaf-05ea-abc5-000000000619 ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 30583 1726853682.73384: no more pending results, returning what we have 30583 1726853682.73388: results queue empty 30583 1726853682.73389: checking for any_errors_fatal 30583 1726853682.73398: done checking for any_errors_fatal 30583 1726853682.73399: checking for max_fail_percentage 30583 1726853682.73401: done checking for max_fail_percentage 30583 1726853682.73401: checking to see if all hosts have failed and the running result is not ok 30583 1726853682.73404: done checking to see if all hosts have failed 30583 1726853682.73404: getting the remaining hosts for this loop 30583 1726853682.73406: done getting the remaining hosts for this loop 30583 1726853682.73411: getting the next task for host managed_node2 30583 1726853682.73420: done getting next task for host managed_node2 30583 1726853682.73538: ^ task is: TASK: Show current_interfaces 30583 1726853682.73543: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853682.73547: getting variables 30583 1726853682.73549: in VariableManager get_vars() 30583 1726853682.73585: Calling all_inventory to load vars for managed_node2 30583 1726853682.73592: Calling groups_inventory to load vars for managed_node2 30583 1726853682.73598: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853682.73609: Calling all_plugins_play to load vars for managed_node2 30583 1726853682.73613: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853682.73618: Calling groups_plugins_play to load vars for managed_node2 30583 1726853682.74157: done sending task result for task 02083763-bbaf-05ea-abc5-000000000619 30583 1726853682.74161: WORKER PROCESS EXITING 30583 1726853682.75410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853682.77084: done with get_vars() 30583 1726853682.77116: done getting variables 30583 1726853682.77182: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 13:34:42 -0400 (0:00:00.065) 0:00:18.109 ****** 30583 1726853682.77222: entering _queue_task() for managed_node2/debug 30583 1726853682.77602: worker is 1 (out of 1 available) 30583 1726853682.77615: exiting _queue_task() for managed_node2/debug 30583 1726853682.77627: done queuing things up, now waiting for results queue to drain 30583 1726853682.77628: waiting for pending 
results... 30583 1726853682.77944: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 30583 1726853682.78074: in run() - task 02083763-bbaf-05ea-abc5-0000000005de 30583 1726853682.78103: variable 'ansible_search_path' from source: unknown 30583 1726853682.78112: variable 'ansible_search_path' from source: unknown 30583 1726853682.78151: calling self._execute() 30583 1726853682.78252: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.78263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.78278: variable 'omit' from source: magic vars 30583 1726853682.78688: variable 'ansible_distribution_major_version' from source: facts 30583 1726853682.78708: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853682.78726: variable 'omit' from source: magic vars 30583 1726853682.78783: variable 'omit' from source: magic vars 30583 1726853682.78898: variable 'current_interfaces' from source: set_fact 30583 1726853682.78930: variable 'omit' from source: magic vars 30583 1726853682.78984: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853682.79030: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853682.79059: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853682.79091: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853682.79179: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853682.79183: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853682.79185: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.79188: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.79284: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853682.79299: Set connection var ansible_timeout to 10 30583 1726853682.79394: Set connection var ansible_connection to ssh 30583 1726853682.79397: Set connection var ansible_shell_executable to /bin/sh 30583 1726853682.79400: Set connection var ansible_shell_type to sh 30583 1726853682.79402: Set connection var ansible_pipelining to False 30583 1726853682.79404: variable 'ansible_shell_executable' from source: unknown 30583 1726853682.79406: variable 'ansible_connection' from source: unknown 30583 1726853682.79408: variable 'ansible_module_compression' from source: unknown 30583 1726853682.79410: variable 'ansible_shell_type' from source: unknown 30583 1726853682.79412: variable 'ansible_shell_executable' from source: unknown 30583 1726853682.79413: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.79415: variable 'ansible_pipelining' from source: unknown 30583 1726853682.79417: variable 'ansible_timeout' from source: unknown 30583 1726853682.79419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.79564: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853682.79586: variable 'omit' from source: magic vars 30583 1726853682.79610: starting attempt loop 30583 1726853682.79613: running the handler 30583 1726853682.79667: handler run complete 30583 1726853682.79719: attempt loop complete, returning result 30583 1726853682.79722: _execute() done 30583 1726853682.79725: dumping result to json 30583 1726853682.79727: done dumping result, returning 30583 1726853682.79730: done 
running TaskExecutor() for managed_node2/TASK: Show current_interfaces [02083763-bbaf-05ea-abc5-0000000005de] 30583 1726853682.79732: sending task result for task 02083763-bbaf-05ea-abc5-0000000005de ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 30583 1726853682.79911: no more pending results, returning what we have 30583 1726853682.79915: results queue empty 30583 1726853682.79917: checking for any_errors_fatal 30583 1726853682.79923: done checking for any_errors_fatal 30583 1726853682.79927: checking for max_fail_percentage 30583 1726853682.79931: done checking for max_fail_percentage 30583 1726853682.79932: checking to see if all hosts have failed and the running result is not ok 30583 1726853682.79933: done checking to see if all hosts have failed 30583 1726853682.79934: getting the remaining hosts for this loop 30583 1726853682.79936: done getting the remaining hosts for this loop 30583 1726853682.79940: getting the next task for host managed_node2 30583 1726853682.79949: done getting next task for host managed_node2 30583 1726853682.79953: ^ task is: TASK: Setup 30583 1726853682.79960: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853682.79965: getting variables 30583 1726853682.79967: in VariableManager get_vars() 30583 1726853682.80000: Calling all_inventory to load vars for managed_node2 30583 1726853682.80003: Calling groups_inventory to load vars for managed_node2 30583 1726853682.80007: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853682.80019: Calling all_plugins_play to load vars for managed_node2 30583 1726853682.80022: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853682.80025: Calling groups_plugins_play to load vars for managed_node2 30583 1726853682.80787: done sending task result for task 02083763-bbaf-05ea-abc5-0000000005de 30583 1726853682.80790: WORKER PROCESS EXITING 30583 1726853682.81704: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853682.83345: done with get_vars() 30583 1726853682.83377: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 13:34:42 -0400 (0:00:00.062) 0:00:18.172 ****** 30583 1726853682.83486: entering _queue_task() for managed_node2/include_tasks 30583 1726853682.83975: worker is 1 (out of 1 available) 30583 1726853682.83987: exiting _queue_task() for managed_node2/include_tasks 30583 1726853682.83998: done queuing things up, now waiting for results queue to drain 30583 1726853682.83999: waiting for pending results... 
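The `Setup` task queued next is an `include_tasks` at `tasks/run_test.yml:24`. Judging by the two `lsr_setup` lookups and the per-`item` variable resolution below (which later resolve to `tasks/delete_interface.yml` and `tasks/assert_device_absent.yml`), it is presumably a loop of roughly this shape; this is a sketch inferred from the log, not the verbatim file:

```yaml
# Hypothetical sketch of the Setup task at tasks/run_test.yml:24,
# inferred from the 'lsr_setup'/'item' variable lookups in the log.
- name: Setup
  include_tasks: "{{ item }}"
  loop: "{{ lsr_setup }}"
```

This matches the subsequent "we have included files to process" phase, where both included files are loaded and the task list is extended for managed_node2.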
30583 1726853682.84197: running TaskExecutor() for managed_node2/TASK: Setup 30583 1726853682.84311: in run() - task 02083763-bbaf-05ea-abc5-0000000005b7 30583 1726853682.84335: variable 'ansible_search_path' from source: unknown 30583 1726853682.84342: variable 'ansible_search_path' from source: unknown 30583 1726853682.84398: variable 'lsr_setup' from source: include params 30583 1726853682.84619: variable 'lsr_setup' from source: include params 30583 1726853682.84701: variable 'omit' from source: magic vars 30583 1726853682.84856: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.84877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.84892: variable 'omit' from source: magic vars 30583 1726853682.85125: variable 'ansible_distribution_major_version' from source: facts 30583 1726853682.85144: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853682.85157: variable 'item' from source: unknown 30583 1726853682.85237: variable 'item' from source: unknown 30583 1726853682.85288: variable 'item' from source: unknown 30583 1726853682.85351: variable 'item' from source: unknown 30583 1726853682.85778: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.85781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.85784: variable 'omit' from source: magic vars 30583 1726853682.85785: variable 'ansible_distribution_major_version' from source: facts 30583 1726853682.85787: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853682.85789: variable 'item' from source: unknown 30583 1726853682.85824: variable 'item' from source: unknown 30583 1726853682.85857: variable 'item' from source: unknown 30583 1726853682.85926: variable 'item' from source: unknown 30583 1726853682.86015: dumping result to json 30583 1726853682.86076: done dumping result, returning 30583 
1726853682.86080: done running TaskExecutor() for managed_node2/TASK: Setup [02083763-bbaf-05ea-abc5-0000000005b7] 30583 1726853682.86082: sending task result for task 02083763-bbaf-05ea-abc5-0000000005b7 30583 1726853682.86148: no more pending results, returning what we have 30583 1726853682.86153: in VariableManager get_vars() 30583 1726853682.86197: Calling all_inventory to load vars for managed_node2 30583 1726853682.86201: Calling groups_inventory to load vars for managed_node2 30583 1726853682.86209: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853682.86223: Calling all_plugins_play to load vars for managed_node2 30583 1726853682.86226: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853682.86229: Calling groups_plugins_play to load vars for managed_node2 30583 1726853682.86891: done sending task result for task 02083763-bbaf-05ea-abc5-0000000005b7 30583 1726853682.86894: WORKER PROCESS EXITING 30583 1726853682.88178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853682.89792: done with get_vars() 30583 1726853682.89819: variable 'ansible_search_path' from source: unknown 30583 1726853682.89820: variable 'ansible_search_path' from source: unknown 30583 1726853682.89863: variable 'ansible_search_path' from source: unknown 30583 1726853682.89864: variable 'ansible_search_path' from source: unknown 30583 1726853682.89897: we have included files to process 30583 1726853682.89898: generating all_blocks data 30583 1726853682.89899: done generating all_blocks data 30583 1726853682.89903: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30583 1726853682.89904: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30583 1726853682.89907: Loading data from 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30583 1726853682.90091: done processing included file 30583 1726853682.90093: iterating over new_blocks loaded from include file 30583 1726853682.90095: in VariableManager get_vars() 30583 1726853682.90110: done with get_vars() 30583 1726853682.90112: filtering new block on tags 30583 1726853682.90140: done filtering new block on tags 30583 1726853682.90143: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node2 => (item=tasks/delete_interface.yml) 30583 1726853682.90148: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30583 1726853682.90149: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30583 1726853682.90151: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30583 1726853682.90231: in VariableManager get_vars() 30583 1726853682.90254: done with get_vars() 30583 1726853682.90337: done processing included file 30583 1726853682.90339: iterating over new_blocks loaded from include file 30583 1726853682.90340: in VariableManager get_vars() 30583 1726853682.90357: done with get_vars() 30583 1726853682.90359: filtering new block on tags 30583 1726853682.90390: done filtering new block on tags 30583 1726853682.90392: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 => (item=tasks/assert_device_absent.yml) 30583 1726853682.90395: extending task lists for all hosts with 
included blocks 30583 1726853682.91006: done extending task lists 30583 1726853682.91007: done processing included files 30583 1726853682.91008: results queue empty 30583 1726853682.91008: checking for any_errors_fatal 30583 1726853682.91011: done checking for any_errors_fatal 30583 1726853682.91012: checking for max_fail_percentage 30583 1726853682.91013: done checking for max_fail_percentage 30583 1726853682.91014: checking to see if all hosts have failed and the running result is not ok 30583 1726853682.91015: done checking to see if all hosts have failed 30583 1726853682.91016: getting the remaining hosts for this loop 30583 1726853682.91017: done getting the remaining hosts for this loop 30583 1726853682.91019: getting the next task for host managed_node2 30583 1726853682.91023: done getting next task for host managed_node2 30583 1726853682.91025: ^ task is: TASK: Remove test interface if necessary 30583 1726853682.91028: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853682.91030: getting variables 30583 1726853682.91032: in VariableManager get_vars() 30583 1726853682.91046: Calling all_inventory to load vars for managed_node2 30583 1726853682.91048: Calling groups_inventory to load vars for managed_node2 30583 1726853682.91050: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853682.91055: Calling all_plugins_play to load vars for managed_node2 30583 1726853682.91058: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853682.91061: Calling groups_plugins_play to load vars for managed_node2 30583 1726853682.92230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853682.94306: done with get_vars() 30583 1726853682.94328: done getting variables 30583 1726853682.94379: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 13:34:42 -0400 (0:00:00.109) 0:00:18.281 ****** 30583 1726853682.94418: entering _queue_task() for managed_node2/command 30583 1726853682.94761: worker is 1 (out of 1 available) 30583 1726853682.94775: exiting _queue_task() for managed_node2/command 30583 1726853682.94789: done queuing things up, now waiting for results queue to drain 30583 1726853682.94790: waiting for pending results... 
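The command task queued here lives at `tasks/delete_interface.yml:3` and reads the `interface` play variable. The log does not show the command itself, but a typical interface-cleanup task of this kind might look like the following sketch (the `ip link del` invocation and the error handling are assumptions, not taken from the log):

```yaml
# Hypothetical sketch of tasks/delete_interface.yml:3. The real command
# is not visible in this log; 'ip link del' and failed_when are assumed.
- name: Remove test interface if necessary
  command: ip link del {{ interface }}
  failed_when: false
```

Since this is a `command` task, the trace below switches to remote execution: `_low_level_execute_command()` probes the home directory (`echo ~`), creates a remote temp directory, and transfers the cached `AnsiballZ_command.py` payload over the multiplexed SSH connection.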
30583 1726853682.95265: running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary 30583 1726853682.95425: in run() - task 02083763-bbaf-05ea-abc5-00000000063e 30583 1726853682.95437: variable 'ansible_search_path' from source: unknown 30583 1726853682.95441: variable 'ansible_search_path' from source: unknown 30583 1726853682.95485: calling self._execute() 30583 1726853682.95649: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.95653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.95774: variable 'omit' from source: magic vars 30583 1726853682.96263: variable 'ansible_distribution_major_version' from source: facts 30583 1726853682.96276: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853682.96308: variable 'omit' from source: magic vars 30583 1726853682.96368: variable 'omit' from source: magic vars 30583 1726853682.96466: variable 'interface' from source: play vars 30583 1726853682.96512: variable 'omit' from source: magic vars 30583 1726853682.96590: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853682.96631: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853682.96762: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853682.96765: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853682.96767: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853682.96890: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853682.96906: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.96909: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.97097: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853682.97108: Set connection var ansible_timeout to 10 30583 1726853682.97111: Set connection var ansible_connection to ssh 30583 1726853682.97185: Set connection var ansible_shell_executable to /bin/sh 30583 1726853682.97192: Set connection var ansible_shell_type to sh 30583 1726853682.97200: Set connection var ansible_pipelining to False 30583 1726853682.97221: variable 'ansible_shell_executable' from source: unknown 30583 1726853682.97321: variable 'ansible_connection' from source: unknown 30583 1726853682.97325: variable 'ansible_module_compression' from source: unknown 30583 1726853682.97327: variable 'ansible_shell_type' from source: unknown 30583 1726853682.97376: variable 'ansible_shell_executable' from source: unknown 30583 1726853682.97380: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853682.97440: variable 'ansible_pipelining' from source: unknown 30583 1726853682.97444: variable 'ansible_timeout' from source: unknown 30583 1726853682.97453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853682.97788: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853682.97798: variable 'omit' from source: magic vars 30583 1726853682.97804: starting attempt loop 30583 1726853682.97807: running the handler 30583 1726853682.97847: _low_level_execute_command(): starting 30583 1726853682.97853: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853682.99155: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853682.99237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853682.99277: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853682.99351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853683.01115: stdout chunk (state=3): >>>/root <<< 30583 1726853683.01244: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853683.01248: stdout chunk (state=3): >>><<< 30583 1726853683.01250: stderr chunk (state=3): >>><<< 30583 1726853683.01374: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853683.01378: _low_level_execute_command(): starting 30583 1726853683.01382: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853683.0128267-31414-104465326265545 `" && echo ansible-tmp-1726853683.0128267-31414-104465326265545="` echo /root/.ansible/tmp/ansible-tmp-1726853683.0128267-31414-104465326265545 `" ) && sleep 0' 30583 1726853683.02242: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853683.02246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853683.02316: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853683.02330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853683.02335: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration <<< 30583 1726853683.02347: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853683.02518: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853683.02522: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853683.02585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853683.02692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853683.04724: stdout chunk (state=3): >>>ansible-tmp-1726853683.0128267-31414-104465326265545=/root/.ansible/tmp/ansible-tmp-1726853683.0128267-31414-104465326265545 <<< 30583 1726853683.04828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853683.04857: stderr chunk (state=3): >>><<< 30583 1726853683.04861: stdout chunk (state=3): >>><<< 30583 1726853683.04880: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853683.0128267-31414-104465326265545=/root/.ansible/tmp/ansible-tmp-1726853683.0128267-31414-104465326265545 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853683.04920: variable 'ansible_module_compression' from source: unknown 30583 1726853683.04974: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30583 1726853683.05016: variable 'ansible_facts' from source: unknown 30583 1726853683.05099: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853683.0128267-31414-104465326265545/AnsiballZ_command.py 30583 1726853683.05206: Sending initial data 30583 1726853683.05211: Sent initial data (156 bytes) 30583 1726853683.05657: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853683.05661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853683.05663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853683.05665: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853683.05667: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853683.05741: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853683.05744: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853683.05757: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853683.05821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853683.07508: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853683.07564: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853683.07639: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmptyu3ogj6 /root/.ansible/tmp/ansible-tmp-1726853683.0128267-31414-104465326265545/AnsiballZ_command.py <<< 30583 1726853683.07642: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853683.0128267-31414-104465326265545/AnsiballZ_command.py" <<< 30583 1726853683.07707: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmptyu3ogj6" to remote "/root/.ansible/tmp/ansible-tmp-1726853683.0128267-31414-104465326265545/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853683.0128267-31414-104465326265545/AnsiballZ_command.py" <<< 30583 1726853683.08745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853683.08748: stdout chunk (state=3): >>><<< 30583 1726853683.08751: stderr chunk (state=3): >>><<< 30583 1726853683.08753: done transferring module to remote 30583 1726853683.08789: _low_level_execute_command(): starting 30583 1726853683.08801: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853683.0128267-31414-104465326265545/ /root/.ansible/tmp/ansible-tmp-1726853683.0128267-31414-104465326265545/AnsiballZ_command.py && sleep 0' 30583 1726853683.10264: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853683.10274: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853683.10336: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853683.10609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853683.12339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853683.12343: stdout chunk (state=3): >>><<< 30583 1726853683.12349: stderr chunk (state=3): >>><<< 30583 1726853683.12370: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853683.12376: _low_level_execute_command(): starting 30583 1726853683.12379: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853683.0128267-31414-104465326265545/AnsiballZ_command.py && sleep 0' 30583 1726853683.13434: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853683.13588: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853683.13685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853683.13691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853683.13694: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853683.13697: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853683.13699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853683.13701: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853683.13703: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853683.13705: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853683.13707: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853683.13710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853683.13712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853683.13713: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853683.13715: stderr chunk (state=3): >>>debug2: match found <<< 30583 1726853683.13717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853683.13779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853683.14203: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853683.14345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853683.30700: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-20 13:34:43.299256", "end": "2024-09-20 13:34:43.305914", "delta": "0:00:00.006658", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30583 1726853683.32478: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853683.32482: stdout chunk (state=3): >>><<< 30583 1726853683.32485: stderr chunk (state=3): >>><<< 30583 1726853683.32488: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-20 13:34:43.299256", "end": "2024-09-20 13:34:43.305914", "delta": "0:00:00.006658", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.197 closed. 
30583 1726853683.32491: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853683.0128267-31414-104465326265545/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853683.32493: _low_level_execute_command(): starting 30583 1726853683.32495: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853683.0128267-31414-104465326265545/ > /dev/null 2>&1 && sleep 0' 30583 1726853683.33007: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853683.33016: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853683.33027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853683.33097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853683.33100: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853683.33103: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853683.33193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853683.33207: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853683.33215: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853683.33268: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853683.33277: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853683.33280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853683.33282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853683.33284: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853683.33285: stderr chunk (state=3): >>>debug2: match found <<< 30583 1726853683.33287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853683.33407: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853683.33411: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853683.33628: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853683.35413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853683.35424: stdout chunk (state=3): >>><<< 30583 1726853683.35437: stderr chunk (state=3): >>><<< 30583 1726853683.35458: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853683.35473: handler run complete 30583 1726853683.35502: Evaluated conditional (False): False 30583 1726853683.35518: attempt loop complete, returning result 30583 1726853683.35526: _execute() done 30583 1726853683.35533: dumping result to json 30583 1726853683.35543: done dumping result, returning 30583 1726853683.35556: done running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary [02083763-bbaf-05ea-abc5-00000000063e] 30583 1726853683.35565: sending task result for task 02083763-bbaf-05ea-abc5-00000000063e 30583 1726853683.35686: done sending task result for task 02083763-bbaf-05ea-abc5-00000000063e fatal: [managed_node2]: FAILED! 
=> { "changed": false, "cmd": [ "ip", "link", "del", "statebr" ], "delta": "0:00:00.006658", "end": "2024-09-20 13:34:43.305914", "rc": 1, "start": "2024-09-20 13:34:43.299256" } STDERR: Cannot find device "statebr" MSG: non-zero return code ...ignoring 30583 1726853683.35751: no more pending results, returning what we have 30583 1726853683.35758: results queue empty 30583 1726853683.35759: checking for any_errors_fatal 30583 1726853683.35760: done checking for any_errors_fatal 30583 1726853683.35761: checking for max_fail_percentage 30583 1726853683.35763: done checking for max_fail_percentage 30583 1726853683.35763: checking to see if all hosts have failed and the running result is not ok 30583 1726853683.35764: done checking to see if all hosts have failed 30583 1726853683.35765: getting the remaining hosts for this loop 30583 1726853683.35766: done getting the remaining hosts for this loop 30583 1726853683.35770: getting the next task for host managed_node2 30583 1726853683.35781: done getting next task for host managed_node2 30583 1726853683.35784: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30583 1726853683.35794: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 30583 1726853683.35799: getting variables 30583 1726853683.35801: in VariableManager get_vars() 30583 1726853683.35832: Calling all_inventory to load vars for managed_node2 30583 1726853683.35835: Calling groups_inventory to load vars for managed_node2 30583 1726853683.35838: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853683.35850: Calling all_plugins_play to load vars for managed_node2 30583 1726853683.35852: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853683.35857: Calling groups_plugins_play to load vars for managed_node2 30583 1726853683.36686: WORKER PROCESS EXITING 30583 1726853683.37368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853683.38894: done with get_vars() 30583 1726853683.38918: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 13:34:43 -0400 (0:00:00.445) 0:00:18.727 ****** 30583 1726853683.39026: entering _queue_task() for managed_node2/include_tasks 30583 1726853683.39650: worker is 1 (out of 1 available) 30583 1726853683.39663: exiting _queue_task() for managed_node2/include_tasks 30583 1726853683.39879: done queuing things up, now waiting for results queue to drain 30583 1726853683.39881: waiting for pending results... 
30583 1726853683.39953: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 30583 1726853683.40086: in run() - task 02083763-bbaf-05ea-abc5-000000000642 30583 1726853683.40115: variable 'ansible_search_path' from source: unknown 30583 1726853683.40124: variable 'ansible_search_path' from source: unknown 30583 1726853683.40164: calling self._execute() 30583 1726853683.40262: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853683.40276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853683.40291: variable 'omit' from source: magic vars 30583 1726853683.40711: variable 'ansible_distribution_major_version' from source: facts 30583 1726853683.40729: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853683.40744: _execute() done 30583 1726853683.40753: dumping result to json 30583 1726853683.40765: done dumping result, returning 30583 1726853683.40856: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-05ea-abc5-000000000642] 30583 1726853683.40859: sending task result for task 02083763-bbaf-05ea-abc5-000000000642 30583 1726853683.41106: no more pending results, returning what we have 30583 1726853683.41112: in VariableManager get_vars() 30583 1726853683.41158: Calling all_inventory to load vars for managed_node2 30583 1726853683.41161: Calling groups_inventory to load vars for managed_node2 30583 1726853683.41165: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853683.41182: Calling all_plugins_play to load vars for managed_node2 30583 1726853683.41186: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853683.41190: Calling groups_plugins_play to load vars for managed_node2 30583 1726853683.41797: done sending task result for task 02083763-bbaf-05ea-abc5-000000000642 30583 1726853683.41801: WORKER PROCESS EXITING 30583 
1726853683.48437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853683.49967: done with get_vars() 30583 1726853683.49991: variable 'ansible_search_path' from source: unknown 30583 1726853683.49993: variable 'ansible_search_path' from source: unknown 30583 1726853683.50002: variable 'item' from source: include params 30583 1726853683.50090: variable 'item' from source: include params 30583 1726853683.50122: we have included files to process 30583 1726853683.50123: generating all_blocks data 30583 1726853683.50124: done generating all_blocks data 30583 1726853683.50126: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30583 1726853683.50127: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30583 1726853683.50129: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30583 1726853683.50287: done processing included file 30583 1726853683.50289: iterating over new_blocks loaded from include file 30583 1726853683.50290: in VariableManager get_vars() 30583 1726853683.50305: done with get_vars() 30583 1726853683.50306: filtering new block on tags 30583 1726853683.50332: done filtering new block on tags 30583 1726853683.50334: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 30583 1726853683.50339: extending task lists for all hosts with included blocks 30583 1726853683.50486: done extending task lists 30583 1726853683.50488: done processing included files 30583 1726853683.50488: results queue empty 30583 1726853683.50489: checking for any_errors_fatal 30583 1726853683.50493: done 
checking for any_errors_fatal 30583 1726853683.50494: checking for max_fail_percentage 30583 1726853683.50495: done checking for max_fail_percentage 30583 1726853683.50495: checking to see if all hosts have failed and the running result is not ok 30583 1726853683.50496: done checking to see if all hosts have failed 30583 1726853683.50497: getting the remaining hosts for this loop 30583 1726853683.50498: done getting the remaining hosts for this loop 30583 1726853683.50500: getting the next task for host managed_node2 30583 1726853683.50504: done getting next task for host managed_node2 30583 1726853683.50506: ^ task is: TASK: Get stat for interface {{ interface }} 30583 1726853683.50509: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853683.50511: getting variables 30583 1726853683.50512: in VariableManager get_vars() 30583 1726853683.50521: Calling all_inventory to load vars for managed_node2 30583 1726853683.50523: Calling groups_inventory to load vars for managed_node2 30583 1726853683.50525: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853683.50530: Calling all_plugins_play to load vars for managed_node2 30583 1726853683.50556: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853683.50560: Calling groups_plugins_play to load vars for managed_node2 30583 1726853683.52418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853683.55104: done with get_vars() 30583 1726853683.55139: done getting variables 30583 1726853683.55282: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:34:43 -0400 (0:00:00.162) 0:00:18.890 ****** 30583 1726853683.55310: entering _queue_task() for managed_node2/stat 30583 1726853683.55680: worker is 1 (out of 1 available) 30583 1726853683.55693: exiting _queue_task() for managed_node2/stat 30583 1726853683.55709: done queuing things up, now waiting for results queue to drain 30583 1726853683.55711: waiting for pending results... 
30583 1726853683.55949: running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr 30583 1726853683.56099: in run() - task 02083763-bbaf-05ea-abc5-000000000691 30583 1726853683.56123: variable 'ansible_search_path' from source: unknown 30583 1726853683.56130: variable 'ansible_search_path' from source: unknown 30583 1726853683.56206: calling self._execute() 30583 1726853683.56262: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853683.56277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853683.56290: variable 'omit' from source: magic vars 30583 1726853683.56950: variable 'ansible_distribution_major_version' from source: facts 30583 1726853683.56973: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853683.56985: variable 'omit' from source: magic vars 30583 1726853683.57054: variable 'omit' from source: magic vars 30583 1726853683.57149: variable 'interface' from source: play vars 30583 1726853683.57479: variable 'omit' from source: magic vars 30583 1726853683.57483: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853683.57486: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853683.57505: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853683.57525: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853683.57597: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853683.57629: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853683.57977: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853683.57981: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853683.57984: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853683.57986: Set connection var ansible_timeout to 10 30583 1726853683.57988: Set connection var ansible_connection to ssh 30583 1726853683.57990: Set connection var ansible_shell_executable to /bin/sh 30583 1726853683.57992: Set connection var ansible_shell_type to sh 30583 1726853683.57994: Set connection var ansible_pipelining to False 30583 1726853683.57996: variable 'ansible_shell_executable' from source: unknown 30583 1726853683.57998: variable 'ansible_connection' from source: unknown 30583 1726853683.58001: variable 'ansible_module_compression' from source: unknown 30583 1726853683.58003: variable 'ansible_shell_type' from source: unknown 30583 1726853683.58005: variable 'ansible_shell_executable' from source: unknown 30583 1726853683.58007: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853683.58008: variable 'ansible_pipelining' from source: unknown 30583 1726853683.58010: variable 'ansible_timeout' from source: unknown 30583 1726853683.58012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853683.58542: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853683.58591: variable 'omit' from source: magic vars 30583 1726853683.58624: starting attempt loop 30583 1726853683.58641: running the handler 30583 1726853683.58702: _low_level_execute_command(): starting 30583 1726853683.58714: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853683.60104: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853683.60145: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853683.60160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853683.60387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853683.62198: stdout chunk (state=3): >>>/root <<< 30583 1726853683.62381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853683.62394: stdout chunk (state=3): >>><<< 30583 1726853683.62474: stderr chunk (state=3): >>><<< 30583 1726853683.62516: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853683.62777: _low_level_execute_command(): starting 30583 1726853683.62792: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853683.6258492-31453-269557100067809 `" && echo ansible-tmp-1726853683.6258492-31453-269557100067809="` echo /root/.ansible/tmp/ansible-tmp-1726853683.6258492-31453-269557100067809 `" ) && sleep 0' 30583 1726853683.64405: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853683.64426: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853683.64499: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853683.64705: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853683.64786: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853683.64904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853683.66964: stdout chunk (state=3): >>>ansible-tmp-1726853683.6258492-31453-269557100067809=/root/.ansible/tmp/ansible-tmp-1726853683.6258492-31453-269557100067809 <<< 30583 1726853683.67101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853683.67126: stderr chunk (state=3): >>><<< 30583 1726853683.67129: stdout chunk (state=3): >>><<< 30583 1726853683.67147: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853683.6258492-31453-269557100067809=/root/.ansible/tmp/ansible-tmp-1726853683.6258492-31453-269557100067809 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853683.67279: variable 'ansible_module_compression' from source: unknown 30583 1726853683.67282: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30583 1726853683.67313: variable 'ansible_facts' from source: unknown 30583 1726853683.67419: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853683.6258492-31453-269557100067809/AnsiballZ_stat.py 30583 1726853683.67592: Sending initial data 30583 1726853683.67606: Sent initial data (153 bytes) 30583 1726853683.68213: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853683.68265: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853683.68282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 30583 1726853683.68353: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853683.68385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853683.68487: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853683.70278: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853683.70380: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853683.70542: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp3smi7jcw /root/.ansible/tmp/ansible-tmp-1726853683.6258492-31453-269557100067809/AnsiballZ_stat.py <<< 30583 1726853683.70546: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853683.6258492-31453-269557100067809/AnsiballZ_stat.py" <<< 30583 1726853683.70588: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp3smi7jcw" to remote "/root/.ansible/tmp/ansible-tmp-1726853683.6258492-31453-269557100067809/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853683.6258492-31453-269557100067809/AnsiballZ_stat.py" <<< 30583 1726853683.71694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853683.71698: stderr chunk (state=3): >>><<< 30583 1726853683.71793: stdout chunk (state=3): >>><<< 30583 1726853683.71797: done transferring module to remote 30583 1726853683.71800: _low_level_execute_command(): starting 30583 1726853683.71802: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853683.6258492-31453-269557100067809/ /root/.ansible/tmp/ansible-tmp-1726853683.6258492-31453-269557100067809/AnsiballZ_stat.py && sleep 0' 30583 1726853683.72409: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853683.72414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853683.72442: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 
1726853683.72446: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853683.72448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853683.72451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853683.72508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853683.72511: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853683.72589: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853683.74787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853683.74791: stdout chunk (state=3): >>><<< 30583 1726853683.74794: stderr chunk (state=3): >>><<< 30583 1726853683.75077: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853683.75081: _low_level_execute_command(): starting 30583 1726853683.75084: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853683.6258492-31453-269557100067809/AnsiballZ_stat.py && sleep 0' 30583 1726853683.75931: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853683.75935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853683.75960: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853683.75963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853683.76011: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' <<< 30583 1726853683.76023: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853683.76117: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853683.76160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853683.91789: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30583 1726853683.93269: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853683.93325: stderr chunk (state=3): >>><<< 30583 1726853683.93335: stdout chunk (state=3): >>><<< 30583 1726853683.93376: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853683.93463: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853683.6258492-31453-269557100067809/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853683.93488: _low_level_execute_command(): starting 30583 1726853683.93496: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853683.6258492-31453-269557100067809/ > /dev/null 2>&1 && sleep 0' 30583 1726853683.94784: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853683.94788: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853683.94791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853683.94793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853683.94795: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853683.94797: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853683.94799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853683.94800: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853683.94802: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853683.94957: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853683.95092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853683.95245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853683.97277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853683.97281: stdout chunk (state=3): >>><<< 30583 1726853683.97284: stderr chunk (state=3): >>><<< 30583 1726853683.97286: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853683.97288: handler run complete 30583 1726853683.97290: attempt loop complete, returning result 30583 1726853683.97292: _execute() done 30583 1726853683.97294: dumping result to json 30583 1726853683.97296: done dumping result, returning 30583 1726853683.97298: done running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr [02083763-bbaf-05ea-abc5-000000000691] 30583 1726853683.97301: sending task result for task 02083763-bbaf-05ea-abc5-000000000691 30583 1726853683.97617: done sending task result for task 02083763-bbaf-05ea-abc5-000000000691 30583 1726853683.97620: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 30583 1726853683.97703: no more pending results, returning what we have 30583 1726853683.97707: results queue empty 30583 1726853683.97709: checking for any_errors_fatal 30583 1726853683.97710: done checking for any_errors_fatal 30583 1726853683.97711: checking for max_fail_percentage 30583 1726853683.97712: done checking for max_fail_percentage 30583 1726853683.97713: checking to see if all hosts have failed and the running result is not ok 30583 1726853683.97714: done checking to see if all hosts have failed 30583 1726853683.97715: getting the remaining hosts for this loop 
30583 1726853683.97717: done getting the remaining hosts for this loop 30583 1726853683.97721: getting the next task for host managed_node2 30583 1726853683.97736: done getting next task for host managed_node2 30583 1726853683.97740: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 30583 1726853683.97744: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853683.97749: getting variables 30583 1726853683.97751: in VariableManager get_vars() 30583 1726853683.97795: Calling all_inventory to load vars for managed_node2 30583 1726853683.97798: Calling groups_inventory to load vars for managed_node2 30583 1726853683.97801: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853683.97814: Calling all_plugins_play to load vars for managed_node2 30583 1726853683.97816: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853683.97819: Calling groups_plugins_play to load vars for managed_node2 30583 1726853683.99452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853684.01102: done with get_vars() 30583 1726853684.01132: done getting variables 30583 1726853684.01200: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853684.01343: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 13:34:44 -0400 (0:00:00.460) 0:00:19.351 ****** 30583 1726853684.01380: entering _queue_task() for managed_node2/assert 30583 1726853684.01729: worker is 1 (out of 1 available) 30583 1726853684.01742: exiting _queue_task() for managed_node2/assert 30583 1726853684.01754: done queuing things up, now waiting for results queue to drain 30583 1726853684.01758: waiting for pending results... 
30583 1726853684.02039: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' 30583 1726853684.02111: in run() - task 02083763-bbaf-05ea-abc5-000000000643 30583 1726853684.02138: variable 'ansible_search_path' from source: unknown 30583 1726853684.02147: variable 'ansible_search_path' from source: unknown 30583 1726853684.02191: calling self._execute() 30583 1726853684.02298: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853684.02352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853684.02355: variable 'omit' from source: magic vars 30583 1726853684.02808: variable 'ansible_distribution_major_version' from source: facts 30583 1726853684.02812: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853684.02814: variable 'omit' from source: magic vars 30583 1726853684.02867: variable 'omit' from source: magic vars 30583 1726853684.03021: variable 'interface' from source: play vars 30583 1726853684.03024: variable 'omit' from source: magic vars 30583 1726853684.03027: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853684.03067: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853684.03093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853684.03116: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853684.03124: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853684.03153: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853684.03156: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853684.03162: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853684.03273: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853684.03334: Set connection var ansible_timeout to 10 30583 1726853684.03337: Set connection var ansible_connection to ssh 30583 1726853684.03340: Set connection var ansible_shell_executable to /bin/sh 30583 1726853684.03346: Set connection var ansible_shell_type to sh 30583 1726853684.03349: Set connection var ansible_pipelining to False 30583 1726853684.03351: variable 'ansible_shell_executable' from source: unknown 30583 1726853684.03353: variable 'ansible_connection' from source: unknown 30583 1726853684.03355: variable 'ansible_module_compression' from source: unknown 30583 1726853684.03357: variable 'ansible_shell_type' from source: unknown 30583 1726853684.03359: variable 'ansible_shell_executable' from source: unknown 30583 1726853684.03361: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853684.03363: variable 'ansible_pipelining' from source: unknown 30583 1726853684.03365: variable 'ansible_timeout' from source: unknown 30583 1726853684.03366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853684.03503: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853684.03551: variable 'omit' from source: magic vars 30583 1726853684.03555: starting attempt loop 30583 1726853684.03562: running the handler 30583 1726853684.03677: variable 'interface_stat' from source: set_fact 30583 1726853684.03686: Evaluated conditional (not interface_stat.stat.exists): True 30583 1726853684.03770: handler run complete 30583 1726853684.03775: attempt loop complete, returning result 
30583 1726853684.03777: _execute() done 30583 1726853684.03778: dumping result to json 30583 1726853684.03780: done dumping result, returning 30583 1726853684.03782: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' [02083763-bbaf-05ea-abc5-000000000643] 30583 1726853684.03784: sending task result for task 02083763-bbaf-05ea-abc5-000000000643 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30583 1726853684.03897: no more pending results, returning what we have 30583 1726853684.03902: results queue empty 30583 1726853684.03903: checking for any_errors_fatal 30583 1726853684.03913: done checking for any_errors_fatal 30583 1726853684.03914: checking for max_fail_percentage 30583 1726853684.03916: done checking for max_fail_percentage 30583 1726853684.03917: checking to see if all hosts have failed and the running result is not ok 30583 1726853684.03917: done checking to see if all hosts have failed 30583 1726853684.03918: getting the remaining hosts for this loop 30583 1726853684.03920: done getting the remaining hosts for this loop 30583 1726853684.03924: getting the next task for host managed_node2 30583 1726853684.03934: done getting next task for host managed_node2 30583 1726853684.03937: ^ task is: TASK: Test 30583 1726853684.03941: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853684.03947: getting variables 30583 1726853684.03949: in VariableManager get_vars() 30583 1726853684.03986: Calling all_inventory to load vars for managed_node2 30583 1726853684.03989: Calling groups_inventory to load vars for managed_node2 30583 1726853684.03994: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853684.04006: Calling all_plugins_play to load vars for managed_node2 30583 1726853684.04009: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853684.04012: Calling groups_plugins_play to load vars for managed_node2 30583 1726853684.04606: done sending task result for task 02083763-bbaf-05ea-abc5-000000000643 30583 1726853684.04610: WORKER PROCESS EXITING 30583 1726853684.05617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853684.07244: done with get_vars() 30583 1726853684.07277: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 13:34:44 -0400 (0:00:00.059) 0:00:19.411 ****** 30583 1726853684.07382: entering _queue_task() for managed_node2/include_tasks 30583 1726853684.07747: worker is 1 (out of 1 available) 30583 1726853684.07879: exiting _queue_task() for managed_node2/include_tasks 30583 1726853684.07890: done queuing things up, now waiting for results queue to drain 30583 1726853684.07892: waiting for pending results... 
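The `TASK [Test]` block above is queued from `run_test.yml:30` as an `include_tasks` that loops over the `lsr_test` include parameter (the log shows `variable 'lsr_test' from source: include params`, the `item` loop variable, and the `ansible_distribution_major_version != '6'` conditional). A minimal sketch of what that task likely looks like; the exact wording is an assumption, only the loop variable and conditional are confirmed by the log:

```yaml
# Sketch of the "Test" task at run_test.yml:30 (assumed wording).
# The log confirms it loops over items from the `lsr_test` include param
# and gates on the distribution version.
- name: Test
  include_tasks: "{{ item }}"   # e.g. tasks/create_bridge_profile_no_autoconnect.yml
  when: ansible_distribution_major_version != '6'
  loop: "{{ lsr_test }}"
```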
30583 1726853684.08106: running TaskExecutor() for managed_node2/TASK: Test 30583 1726853684.08241: in run() - task 02083763-bbaf-05ea-abc5-0000000005b8 30583 1726853684.08262: variable 'ansible_search_path' from source: unknown 30583 1726853684.08307: variable 'ansible_search_path' from source: unknown 30583 1726853684.08337: variable 'lsr_test' from source: include params 30583 1726853684.08567: variable 'lsr_test' from source: include params 30583 1726853684.08646: variable 'omit' from source: magic vars 30583 1726853684.08805: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853684.08851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853684.08854: variable 'omit' from source: magic vars 30583 1726853684.09091: variable 'ansible_distribution_major_version' from source: facts 30583 1726853684.09104: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853684.09113: variable 'item' from source: unknown 30583 1726853684.09179: variable 'item' from source: unknown 30583 1726853684.09289: variable 'item' from source: unknown 30583 1726853684.09292: variable 'item' from source: unknown 30583 1726853684.09489: dumping result to json 30583 1726853684.09492: done dumping result, returning 30583 1726853684.09494: done running TaskExecutor() for managed_node2/TASK: Test [02083763-bbaf-05ea-abc5-0000000005b8] 30583 1726853684.09504: sending task result for task 02083763-bbaf-05ea-abc5-0000000005b8 30583 1726853684.09545: done sending task result for task 02083763-bbaf-05ea-abc5-0000000005b8 30583 1726853684.09549: WORKER PROCESS EXITING 30583 1726853684.09621: no more pending results, returning what we have 30583 1726853684.09627: in VariableManager get_vars() 30583 1726853684.09666: Calling all_inventory to load vars for managed_node2 30583 1726853684.09669: Calling groups_inventory to load vars for managed_node2 30583 1726853684.09675: Calling all_plugins_inventory to load 
vars for managed_node2 30583 1726853684.09688: Calling all_plugins_play to load vars for managed_node2 30583 1726853684.09691: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853684.09695: Calling groups_plugins_play to load vars for managed_node2 30583 1726853684.11447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853684.13093: done with get_vars() 30583 1726853684.13120: variable 'ansible_search_path' from source: unknown 30583 1726853684.13122: variable 'ansible_search_path' from source: unknown 30583 1726853684.13163: we have included files to process 30583 1726853684.13164: generating all_blocks data 30583 1726853684.13166: done generating all_blocks data 30583 1726853684.13173: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml 30583 1726853684.13174: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml 30583 1726853684.13177: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml 30583 1726853684.13523: done processing included file 30583 1726853684.13525: iterating over new_blocks loaded from include file 30583 1726853684.13527: in VariableManager get_vars() 30583 1726853684.13542: done with get_vars() 30583 1726853684.13544: filtering new block on tags 30583 1726853684.13579: done filtering new block on tags 30583 1726853684.13582: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml for managed_node2 => (item=tasks/create_bridge_profile_no_autoconnect.yml) 30583 1726853684.13588: extending task 
lists for all hosts with included blocks 30583 1726853684.14439: done extending task lists 30583 1726853684.14441: done processing included files 30583 1726853684.14441: results queue empty 30583 1726853684.14442: checking for any_errors_fatal 30583 1726853684.14445: done checking for any_errors_fatal 30583 1726853684.14446: checking for max_fail_percentage 30583 1726853684.14447: done checking for max_fail_percentage 30583 1726853684.14448: checking to see if all hosts have failed and the running result is not ok 30583 1726853684.14449: done checking to see if all hosts have failed 30583 1726853684.14450: getting the remaining hosts for this loop 30583 1726853684.14451: done getting the remaining hosts for this loop 30583 1726853684.14454: getting the next task for host managed_node2 30583 1726853684.14458: done getting next task for host managed_node2 30583 1726853684.14460: ^ task is: TASK: Include network role 30583 1726853684.14462: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853684.14465: getting variables 30583 1726853684.14466: in VariableManager get_vars() 30583 1726853684.14484: Calling all_inventory to load vars for managed_node2 30583 1726853684.14487: Calling groups_inventory to load vars for managed_node2 30583 1726853684.14489: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853684.14494: Calling all_plugins_play to load vars for managed_node2 30583 1726853684.14497: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853684.14500: Calling groups_plugins_play to load vars for managed_node2 30583 1726853684.15799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853684.17387: done with get_vars() 30583 1726853684.17418: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml:3 Friday 20 September 2024 13:34:44 -0400 (0:00:00.101) 0:00:19.512 ****** 30583 1726853684.17516: entering _queue_task() for managed_node2/include_role 30583 1726853684.18096: worker is 1 (out of 1 available) 30583 1726853684.18107: exiting _queue_task() for managed_node2/include_role 30583 1726853684.18118: done queuing things up, now waiting for results queue to drain 30583 1726853684.18120: waiting for pending results... 
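The `TASK [Include network role]` entry at `create_bridge_profile_no_autoconnect.yml:3` pulls in the `fedora.linux_system_roles.network` role, as the subsequent `included: fedora.linux_system_roles.network for managed_node2` line confirms. A plausible sketch of that task; the `network_connections` contents are assumptions inferred from the task-file name and the `interface` play variable (`statebr`) seen earlier in the log:

```yaml
# Sketch of create_bridge_profile_no_autoconnect.yml:3.
# Only the role name is confirmed by the log; the vars below are
# assumptions based on the file name and the `statebr` interface var.
- name: Include network role
  include_role:
    name: fedora.linux_system_roles.network
  vars:
    network_connections:
      - name: statebr         # interface name from the play vars in the log
        type: bridge
        autoconnect: false    # implied by "no_autoconnect" in the file name
```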
30583 1726853684.18363: running TaskExecutor() for managed_node2/TASK: Include network role 30583 1726853684.18395: in run() - task 02083763-bbaf-05ea-abc5-0000000006b1 30583 1726853684.18416: variable 'ansible_search_path' from source: unknown 30583 1726853684.18424: variable 'ansible_search_path' from source: unknown 30583 1726853684.18477: calling self._execute() 30583 1726853684.18588: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853684.18600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853684.18615: variable 'omit' from source: magic vars 30583 1726853684.19436: variable 'ansible_distribution_major_version' from source: facts 30583 1726853684.19440: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853684.19443: _execute() done 30583 1726853684.19446: dumping result to json 30583 1726853684.19454: done dumping result, returning 30583 1726853684.19506: done running TaskExecutor() for managed_node2/TASK: Include network role [02083763-bbaf-05ea-abc5-0000000006b1] 30583 1726853684.19517: sending task result for task 02083763-bbaf-05ea-abc5-0000000006b1 30583 1726853684.19854: no more pending results, returning what we have 30583 1726853684.19859: in VariableManager get_vars() 30583 1726853684.20058: Calling all_inventory to load vars for managed_node2 30583 1726853684.20062: Calling groups_inventory to load vars for managed_node2 30583 1726853684.20067: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853684.20084: Calling all_plugins_play to load vars for managed_node2 30583 1726853684.20087: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853684.20090: Calling groups_plugins_play to load vars for managed_node2 30583 1726853684.20940: done sending task result for task 02083763-bbaf-05ea-abc5-0000000006b1 30583 1726853684.20943: WORKER PROCESS EXITING 30583 1726853684.22941: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853684.25769: done with get_vars() 30583 1726853684.25801: variable 'ansible_search_path' from source: unknown 30583 1726853684.25802: variable 'ansible_search_path' from source: unknown 30583 1726853684.26193: variable 'omit' from source: magic vars 30583 1726853684.26310: variable 'omit' from source: magic vars 30583 1726853684.26335: variable 'omit' from source: magic vars 30583 1726853684.26340: we have included files to process 30583 1726853684.26341: generating all_blocks data 30583 1726853684.26342: done generating all_blocks data 30583 1726853684.26344: processing included file: fedora.linux_system_roles.network 30583 1726853684.26368: in VariableManager get_vars() 30583 1726853684.26386: done with get_vars() 30583 1726853684.26416: in VariableManager get_vars() 30583 1726853684.26549: done with get_vars() 30583 1726853684.26664: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30583 1726853684.26986: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30583 1726853684.27077: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30583 1726853684.27596: in VariableManager get_vars() 30583 1726853684.27614: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30583 1726853684.29780: iterating over new_blocks loaded from include file 30583 1726853684.29782: in VariableManager get_vars() 30583 1726853684.29802: done with get_vars() 30583 1726853684.29804: filtering new block on tags 30583 1726853684.30405: done filtering new block on tags 30583 1726853684.30409: in VariableManager get_vars() 30583 1726853684.30423: done with get_vars() 30583 1726853684.30425: filtering new block on tags 30583 1726853684.30440: done 
filtering new block on tags 30583 1726853684.30442: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30583 1726853684.30448: extending task lists for all hosts with included blocks 30583 1726853684.30613: done extending task lists 30583 1726853684.30615: done processing included files 30583 1726853684.30616: results queue empty 30583 1726853684.30616: checking for any_errors_fatal 30583 1726853684.30619: done checking for any_errors_fatal 30583 1726853684.30620: checking for max_fail_percentage 30583 1726853684.30621: done checking for max_fail_percentage 30583 1726853684.30622: checking to see if all hosts have failed and the running result is not ok 30583 1726853684.30623: done checking to see if all hosts have failed 30583 1726853684.30623: getting the remaining hosts for this loop 30583 1726853684.30625: done getting the remaining hosts for this loop 30583 1726853684.30627: getting the next task for host managed_node2 30583 1726853684.30631: done getting next task for host managed_node2 30583 1726853684.30634: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30583 1726853684.30637: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853684.30646: getting variables 30583 1726853684.30647: in VariableManager get_vars() 30583 1726853684.30661: Calling all_inventory to load vars for managed_node2 30583 1726853684.30663: Calling groups_inventory to load vars for managed_node2 30583 1726853684.30665: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853684.30672: Calling all_plugins_play to load vars for managed_node2 30583 1726853684.30675: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853684.30677: Calling groups_plugins_play to load vars for managed_node2 30583 1726853684.31867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853684.33408: done with get_vars() 30583 1726853684.33433: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:34:44 -0400 (0:00:00.160) 0:00:19.672 ****** 30583 1726853684.33519: entering _queue_task() for managed_node2/include_tasks 30583 1726853684.33878: worker is 1 (out of 1 available) 30583 1726853684.33892: exiting _queue_task() for managed_node2/include_tasks 30583 1726853684.33904: done queuing things up, now waiting for results queue to drain 30583 1726853684.33906: waiting for pending results... 
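The `Ensure ansible_facts used by role` task at `roles/network/tasks/main.yml:4` is queued as an `include_tasks` for `managed_node2`, and the following lines show it loading `tasks/set_facts.yml`. A one-line sketch of what that entry presumably is (the task name and included file are confirmed by the log; the exact YAML is an assumption):

```yaml
# Sketch of roles/network/tasks/main.yml:4 (assumed wording; the log
# shows this include pulls in tasks/set_facts.yml).
- name: Ensure ansible_facts used by role
  include_tasks: set_facts.yml
```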
30583 1726853684.34203: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30583 1726853684.34331: in run() - task 02083763-bbaf-05ea-abc5-00000000072f 30583 1726853684.34345: variable 'ansible_search_path' from source: unknown 30583 1726853684.34349: variable 'ansible_search_path' from source: unknown 30583 1726853684.34393: calling self._execute() 30583 1726853684.34680: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853684.34684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853684.34688: variable 'omit' from source: magic vars 30583 1726853684.34878: variable 'ansible_distribution_major_version' from source: facts 30583 1726853684.34893: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853684.34897: _execute() done 30583 1726853684.34900: dumping result to json 30583 1726853684.34902: done dumping result, returning 30583 1726853684.34908: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-05ea-abc5-00000000072f] 30583 1726853684.34912: sending task result for task 02083763-bbaf-05ea-abc5-00000000072f 30583 1726853684.35011: done sending task result for task 02083763-bbaf-05ea-abc5-00000000072f 30583 1726853684.35014: WORKER PROCESS EXITING 30583 1726853684.35084: no more pending results, returning what we have 30583 1726853684.35090: in VariableManager get_vars() 30583 1726853684.35131: Calling all_inventory to load vars for managed_node2 30583 1726853684.35135: Calling groups_inventory to load vars for managed_node2 30583 1726853684.35137: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853684.35149: Calling all_plugins_play to load vars for managed_node2 30583 1726853684.35152: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853684.35157: Calling 
groups_plugins_play to load vars for managed_node2 30583 1726853684.36788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853684.38350: done with get_vars() 30583 1726853684.38377: variable 'ansible_search_path' from source: unknown 30583 1726853684.38378: variable 'ansible_search_path' from source: unknown 30583 1726853684.38420: we have included files to process 30583 1726853684.38421: generating all_blocks data 30583 1726853684.38423: done generating all_blocks data 30583 1726853684.38427: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853684.38428: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853684.38430: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853684.39017: done processing included file 30583 1726853684.39020: iterating over new_blocks loaded from include file 30583 1726853684.39021: in VariableManager get_vars() 30583 1726853684.39046: done with get_vars() 30583 1726853684.39048: filtering new block on tags 30583 1726853684.39085: done filtering new block on tags 30583 1726853684.39088: in VariableManager get_vars() 30583 1726853684.39110: done with get_vars() 30583 1726853684.39112: filtering new block on tags 30583 1726853684.39162: done filtering new block on tags 30583 1726853684.39165: in VariableManager get_vars() 30583 1726853684.39187: done with get_vars() 30583 1726853684.39189: filtering new block on tags 30583 1726853684.39225: done filtering new block on tags 30583 1726853684.39227: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30583 1726853684.39232: extending task lists for 
all hosts with included blocks 30583 1726853684.40966: done extending task lists 30583 1726853684.40967: done processing included files 30583 1726853684.40968: results queue empty 30583 1726853684.40969: checking for any_errors_fatal 30583 1726853684.40973: done checking for any_errors_fatal 30583 1726853684.40973: checking for max_fail_percentage 30583 1726853684.40975: done checking for max_fail_percentage 30583 1726853684.40976: checking to see if all hosts have failed and the running result is not ok 30583 1726853684.40976: done checking to see if all hosts have failed 30583 1726853684.40977: getting the remaining hosts for this loop 30583 1726853684.40979: done getting the remaining hosts for this loop 30583 1726853684.40981: getting the next task for host managed_node2 30583 1726853684.40986: done getting next task for host managed_node2 30583 1726853684.40989: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30583 1726853684.40993: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853684.41003: getting variables 30583 1726853684.41004: in VariableManager get_vars() 30583 1726853684.41018: Calling all_inventory to load vars for managed_node2 30583 1726853684.41021: Calling groups_inventory to load vars for managed_node2 30583 1726853684.41023: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853684.41029: Calling all_plugins_play to load vars for managed_node2 30583 1726853684.41032: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853684.41035: Calling groups_plugins_play to load vars for managed_node2 30583 1726853684.42168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853684.43721: done with get_vars() 30583 1726853684.43747: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:34:44 -0400 (0:00:00.103) 0:00:19.775 ****** 30583 1726853684.43834: entering _queue_task() for managed_node2/setup 30583 1726853684.44189: worker is 1 (out of 1 available) 30583 1726853684.44204: exiting _queue_task() for managed_node2/setup 30583 1726853684.44216: done queuing things up, now waiting for results queue to drain 30583 1726853684.44218: waiting for pending results... 
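The task queued next, `Ensure ansible_facts used by role are present` at `set_facts.yml:3`, is dispatched as a `setup` action, and the log further down shows its conditional, `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`, evaluating to False (facts already cached), so it is skipped with its output censored by `no_log`. A sketch under those observations; the `gather_subset` value is an assumption, while the `when` expression and `no_log` behavior are confirmed by the log:

```yaml
# Sketch of set_facts.yml:3. The when expression and no_log skip are
# confirmed by the log; the gather_subset value is an assumption.
- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true
```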
30583 1726853684.44500: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30583 1726853684.44800: in run() - task 02083763-bbaf-05ea-abc5-00000000078c 30583 1726853684.44805: variable 'ansible_search_path' from source: unknown 30583 1726853684.44808: variable 'ansible_search_path' from source: unknown 30583 1726853684.44812: calling self._execute() 30583 1726853684.44827: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853684.44834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853684.44843: variable 'omit' from source: magic vars 30583 1726853684.45235: variable 'ansible_distribution_major_version' from source: facts 30583 1726853684.45244: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853684.45469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853684.47744: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853684.47816: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853684.47852: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853684.47895: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853684.47924: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853684.48004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853684.48032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853684.48057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853684.48105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853684.48120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853684.48177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853684.48202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853684.48227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853684.48302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853684.48305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853684.48438: variable '__network_required_facts' from source: role 
'' defaults 30583 1726853684.48489: variable 'ansible_facts' from source: unknown 30583 1726853684.49182: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30583 1726853684.49186: when evaluation is False, skipping this task 30583 1726853684.49193: _execute() done 30583 1726853684.49196: dumping result to json 30583 1726853684.49199: done dumping result, returning 30583 1726853684.49202: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-05ea-abc5-00000000078c] 30583 1726853684.49207: sending task result for task 02083763-bbaf-05ea-abc5-00000000078c 30583 1726853684.49409: done sending task result for task 02083763-bbaf-05ea-abc5-00000000078c 30583 1726853684.49413: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853684.49459: no more pending results, returning what we have 30583 1726853684.49462: results queue empty 30583 1726853684.49464: checking for any_errors_fatal 30583 1726853684.49465: done checking for any_errors_fatal 30583 1726853684.49466: checking for max_fail_percentage 30583 1726853684.49468: done checking for max_fail_percentage 30583 1726853684.49469: checking to see if all hosts have failed and the running result is not ok 30583 1726853684.49469: done checking to see if all hosts have failed 30583 1726853684.49470: getting the remaining hosts for this loop 30583 1726853684.49474: done getting the remaining hosts for this loop 30583 1726853684.49477: getting the next task for host managed_node2 30583 1726853684.49488: done getting next task for host managed_node2 30583 1726853684.49491: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30583 1726853684.49497: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853684.49514: getting variables 30583 1726853684.49515: in VariableManager get_vars() 30583 1726853684.49550: Calling all_inventory to load vars for managed_node2 30583 1726853684.49553: Calling groups_inventory to load vars for managed_node2 30583 1726853684.49558: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853684.49568: Calling all_plugins_play to load vars for managed_node2 30583 1726853684.49732: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853684.49743: Calling groups_plugins_play to load vars for managed_node2 30583 1726853684.51172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853684.52647: done with get_vars() 30583 1726853684.52679: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:34:44 -0400 (0:00:00.089) 0:00:19.865 ****** 30583 1726853684.52785: entering _queue_task() for managed_node2/stat 30583 1726853684.53142: worker is 1 (out of 1 available) 30583 1726853684.53157: exiting _queue_task() for managed_node2/stat 30583 1726853684.53170: done queuing things up, now waiting for results queue to drain 30583 1726853684.53375: waiting for pending results... 
30583 1726853684.53493: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30583 1726853684.53659: in run() - task 02083763-bbaf-05ea-abc5-00000000078e 30583 1726853684.53663: variable 'ansible_search_path' from source: unknown 30583 1726853684.53665: variable 'ansible_search_path' from source: unknown 30583 1726853684.53766: calling self._execute() 30583 1726853684.53776: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853684.53783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853684.53793: variable 'omit' from source: magic vars 30583 1726853684.54161: variable 'ansible_distribution_major_version' from source: facts 30583 1726853684.54174: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853684.54333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853684.54848: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853684.54852: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853684.54854: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853684.54856: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853684.54989: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853684.54992: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853684.54994: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853684.54996: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853684.55063: variable '__network_is_ostree' from source: set_fact 30583 1726853684.55070: Evaluated conditional (not __network_is_ostree is defined): False 30583 1726853684.55076: when evaluation is False, skipping this task 30583 1726853684.55079: _execute() done 30583 1726853684.55081: dumping result to json 30583 1726853684.55083: done dumping result, returning 30583 1726853684.55092: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-05ea-abc5-00000000078e] 30583 1726853684.55095: sending task result for task 02083763-bbaf-05ea-abc5-00000000078e 30583 1726853684.55192: done sending task result for task 02083763-bbaf-05ea-abc5-00000000078e 30583 1726853684.55196: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30583 1726853684.55249: no more pending results, returning what we have 30583 1726853684.55253: results queue empty 30583 1726853684.55254: checking for any_errors_fatal 30583 1726853684.55265: done checking for any_errors_fatal 30583 1726853684.55266: checking for max_fail_percentage 30583 1726853684.55268: done checking for max_fail_percentage 30583 1726853684.55269: checking to see if all hosts have failed and the running result is not ok 30583 1726853684.55270: done checking to see if all hosts have failed 30583 1726853684.55272: getting the remaining hosts for this loop 30583 1726853684.55274: done getting the remaining hosts for this loop 30583 
1726853684.55278: getting the next task for host managed_node2 30583 1726853684.55287: done getting next task for host managed_node2 30583 1726853684.55290: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30583 1726853684.55295: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853684.55313: getting variables 30583 1726853684.55315: in VariableManager get_vars() 30583 1726853684.55352: Calling all_inventory to load vars for managed_node2 30583 1726853684.55357: Calling groups_inventory to load vars for managed_node2 30583 1726853684.55360: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853684.55475: Calling all_plugins_play to load vars for managed_node2 30583 1726853684.55480: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853684.55484: Calling groups_plugins_play to load vars for managed_node2 30583 1726853684.57547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853684.59207: done with get_vars() 30583 1726853684.59238: done getting variables 30583 1726853684.59360: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:34:44 -0400 (0:00:00.066) 0:00:19.931 ****** 30583 1726853684.59401: entering _queue_task() for managed_node2/set_fact 30583 1726853684.59744: worker is 1 (out of 1 available) 30583 1726853684.59758: exiting _queue_task() for managed_node2/set_fact 30583 1726853684.59770: done queuing things up, now waiting for results queue to drain 30583 1726853684.59774: waiting for pending results... 
30583 1726853684.60290: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30583 1726853684.60295: in run() - task 02083763-bbaf-05ea-abc5-00000000078f 30583 1726853684.60298: variable 'ansible_search_path' from source: unknown 30583 1726853684.60301: variable 'ansible_search_path' from source: unknown 30583 1726853684.60303: calling self._execute() 30583 1726853684.60359: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853684.60367: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853684.60381: variable 'omit' from source: magic vars 30583 1726853684.60876: variable 'ansible_distribution_major_version' from source: facts 30583 1726853684.60880: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853684.60898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853684.61176: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853684.61218: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853684.61254: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853684.61291: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853684.61411: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853684.61432: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853684.61457: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853684.61487: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853684.61573: variable '__network_is_ostree' from source: set_fact 30583 1726853684.61676: Evaluated conditional (not __network_is_ostree is defined): False 30583 1726853684.61680: when evaluation is False, skipping this task 30583 1726853684.61683: _execute() done 30583 1726853684.61685: dumping result to json 30583 1726853684.61688: done dumping result, returning 30583 1726853684.61691: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-05ea-abc5-00000000078f] 30583 1726853684.61693: sending task result for task 02083763-bbaf-05ea-abc5-00000000078f 30583 1726853684.61750: done sending task result for task 02083763-bbaf-05ea-abc5-00000000078f 30583 1726853684.61754: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30583 1726853684.61805: no more pending results, returning what we have 30583 1726853684.61809: results queue empty 30583 1726853684.61810: checking for any_errors_fatal 30583 1726853684.61816: done checking for any_errors_fatal 30583 1726853684.61816: checking for max_fail_percentage 30583 1726853684.61818: done checking for max_fail_percentage 30583 1726853684.61819: checking to see if all hosts have failed and the running result is not ok 30583 1726853684.61820: done checking to see if all hosts have failed 30583 1726853684.61821: getting the remaining hosts for this loop 30583 1726853684.61822: done getting the remaining hosts for this loop 
30583 1726853684.61826: getting the next task for host managed_node2 30583 1726853684.61839: done getting next task for host managed_node2 30583 1726853684.61842: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853684.61849: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853684.61870: getting variables 30583 1726853684.61873: in VariableManager get_vars() 30583 1726853684.61913: Calling all_inventory to load vars for managed_node2 30583 1726853684.61916: Calling groups_inventory to load vars for managed_node2 30583 1726853684.61918: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853684.61928: Calling all_plugins_play to load vars for managed_node2 30583 1726853684.61931: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853684.61934: Calling groups_plugins_play to load vars for managed_node2 30583 1726853684.63642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853684.65787: done with get_vars() 30583 1726853684.65814: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:34:44 -0400 (0:00:00.065) 0:00:19.996 ****** 30583 1726853684.65979: entering _queue_task() for managed_node2/service_facts 30583 1726853684.66419: worker is 1 (out of 1 available) 30583 1726853684.66432: exiting _queue_task() for managed_node2/service_facts 30583 1726853684.66443: done queuing things up, now waiting for results queue to drain 30583 1726853684.66444: waiting for pending results... 
30583 1726853684.67193: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853684.67478: in run() - task 02083763-bbaf-05ea-abc5-000000000791 30583 1726853684.67483: variable 'ansible_search_path' from source: unknown 30583 1726853684.67486: variable 'ansible_search_path' from source: unknown 30583 1726853684.67489: calling self._execute() 30583 1726853684.67834: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853684.67838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853684.67841: variable 'omit' from source: magic vars 30583 1726853684.68446: variable 'ansible_distribution_major_version' from source: facts 30583 1726853684.68674: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853684.68678: variable 'omit' from source: magic vars 30583 1726853684.68680: variable 'omit' from source: magic vars 30583 1726853684.68806: variable 'omit' from source: magic vars 30583 1726853684.68853: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853684.68896: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853684.68998: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853684.69022: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853684.69045: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853684.69250: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853684.69254: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853684.69256: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853684.69396: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853684.69410: Set connection var ansible_timeout to 10 30583 1726853684.69418: Set connection var ansible_connection to ssh 30583 1726853684.69429: Set connection var ansible_shell_executable to /bin/sh 30583 1726853684.69439: Set connection var ansible_shell_type to sh 30583 1726853684.69454: Set connection var ansible_pipelining to False 30583 1726853684.69502: variable 'ansible_shell_executable' from source: unknown 30583 1726853684.69678: variable 'ansible_connection' from source: unknown 30583 1726853684.69683: variable 'ansible_module_compression' from source: unknown 30583 1726853684.69686: variable 'ansible_shell_type' from source: unknown 30583 1726853684.69688: variable 'ansible_shell_executable' from source: unknown 30583 1726853684.69690: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853684.69691: variable 'ansible_pipelining' from source: unknown 30583 1726853684.69693: variable 'ansible_timeout' from source: unknown 30583 1726853684.69695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853684.70138: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853684.70142: variable 'omit' from source: magic vars 30583 1726853684.70145: starting attempt loop 30583 1726853684.70147: running the handler 30583 1726853684.70149: _low_level_execute_command(): starting 30583 1726853684.70151: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853684.71558: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853684.71775: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853684.71898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853684.73642: stdout chunk (state=3): >>>/root <<< 30583 1726853684.73745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853684.73786: stderr chunk (state=3): >>><<< 30583 1726853684.73796: stdout chunk (state=3): >>><<< 30583 1726853684.73928: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853684.73931: _low_level_execute_command(): starting 30583 1726853684.73934: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853684.7389488-31537-39128266370273 `" && echo ansible-tmp-1726853684.7389488-31537-39128266370273="` echo /root/.ansible/tmp/ansible-tmp-1726853684.7389488-31537-39128266370273 `" ) && sleep 0' 30583 1726853684.75139: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853684.75154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853684.75175: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853684.75443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853684.75447: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853684.75514: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853684.75605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853684.77681: stdout chunk (state=3): >>>ansible-tmp-1726853684.7389488-31537-39128266370273=/root/.ansible/tmp/ansible-tmp-1726853684.7389488-31537-39128266370273 <<< 30583 1726853684.77907: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853684.77910: stdout chunk (state=3): >>><<< 30583 1726853684.77918: stderr chunk (state=3): >>><<< 30583 1726853684.77935: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853684.7389488-31537-39128266370273=/root/.ansible/tmp/ansible-tmp-1726853684.7389488-31537-39128266370273 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853684.77994: variable 'ansible_module_compression' from source: unknown 30583 1726853684.78058: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30583 1726853684.78245: variable 'ansible_facts' from source: unknown 30583 1726853684.78421: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853684.7389488-31537-39128266370273/AnsiballZ_service_facts.py 30583 1726853684.78812: Sending initial data 30583 1726853684.78815: Sent initial data (161 bytes) 30583 1726853684.79690: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853684.79812: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 
1726853684.79824: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853684.79913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853684.80014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853684.81714: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853684.81810: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853684.81897: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp1dt0b95c /root/.ansible/tmp/ansible-tmp-1726853684.7389488-31537-39128266370273/AnsiballZ_service_facts.py <<< 30583 1726853684.81912: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853684.7389488-31537-39128266370273/AnsiballZ_service_facts.py" <<< 30583 1726853684.82109: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp1dt0b95c" to remote "/root/.ansible/tmp/ansible-tmp-1726853684.7389488-31537-39128266370273/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853684.7389488-31537-39128266370273/AnsiballZ_service_facts.py" <<< 30583 1726853684.83276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853684.83280: stdout chunk (state=3): >>><<< 30583 1726853684.83282: stderr chunk (state=3): >>><<< 30583 1726853684.83386: done transferring module to remote 30583 1726853684.83396: _low_level_execute_command(): starting 30583 1726853684.83401: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853684.7389488-31537-39128266370273/ /root/.ansible/tmp/ansible-tmp-1726853684.7389488-31537-39128266370273/AnsiballZ_service_facts.py && sleep 0' 30583 1726853684.84297: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853684.84306: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853684.84327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853684.84342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853684.84363: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 <<< 30583 1726853684.84370: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853684.84462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853684.84506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853684.84582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853684.86543: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853684.86547: stdout chunk (state=3): >>><<< 30583 1726853684.86549: stderr chunk (state=3): >>><<< 30583 1726853684.86568: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853684.86652: _low_level_execute_command(): starting 30583 1726853684.86658: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853684.7389488-31537-39128266370273/AnsiballZ_service_facts.py && sleep 0' 30583 1726853684.87193: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853684.87205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853684.87218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853684.87233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853684.87283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 
1726853684.87335: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853684.87349: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853684.87382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853684.87490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853686.51185: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 30583 1726853686.51275: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": 
"systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": 
"unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": 
"microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": 
"unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30583 1726853686.52896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853686.53001: stderr chunk (state=3): >>><<< 30583 1726853686.53004: stdout chunk (state=3): >>><<< 30583 1726853686.53008: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": 
"getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": 
"systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": 
"running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": 
"inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": 
"systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853686.54087: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853684.7389488-31537-39128266370273/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853686.54176: _low_level_execute_command(): starting 30583 1726853686.54179: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853684.7389488-31537-39128266370273/ > /dev/null 2>&1 && sleep 0' 30583 1726853686.54775: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853686.54790: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853686.54803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853686.54834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853686.54850: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853686.54862: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853686.54881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853686.54899: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass <<< 30583 1726853686.54910: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853686.54933: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853686.55020: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853686.55055: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853686.55175: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853686.57225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853686.57238: stdout chunk (state=3): >>><<< 30583 1726853686.57251: stderr chunk (state=3): >>><<< 30583 1726853686.57276: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853686.57289: handler run complete 30583 1726853686.57502: variable 'ansible_facts' from source: unknown 30583 1726853686.57659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853686.58163: variable 'ansible_facts' from source: unknown 30583 1726853686.58286: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853686.58464: attempt loop complete, returning result 30583 1726853686.58676: _execute() done 30583 1726853686.58679: dumping result to json 30583 1726853686.58681: done dumping result, returning 30583 1726853686.58683: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-05ea-abc5-000000000791] 30583 1726853686.58686: sending task result for task 02083763-bbaf-05ea-abc5-000000000791 30583 1726853686.59721: done sending task result for task 02083763-bbaf-05ea-abc5-000000000791 30583 1726853686.59725: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853686.59806: no more pending results, returning what we have 30583 1726853686.59809: results queue empty 30583 1726853686.59810: checking for any_errors_fatal 30583 1726853686.59813: done checking for any_errors_fatal 30583 1726853686.59814: checking for max_fail_percentage 30583 1726853686.59816: done checking for max_fail_percentage 30583 
1726853686.59816: checking to see if all hosts have failed and the running result is not ok 30583 1726853686.59817: done checking to see if all hosts have failed 30583 1726853686.59818: getting the remaining hosts for this loop 30583 1726853686.59819: done getting the remaining hosts for this loop 30583 1726853686.59822: getting the next task for host managed_node2 30583 1726853686.59828: done getting next task for host managed_node2 30583 1726853686.59832: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 1726853686.59838: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853686.59848: getting variables 30583 1726853686.59849: in VariableManager get_vars() 30583 1726853686.59883: Calling all_inventory to load vars for managed_node2 30583 1726853686.59892: Calling groups_inventory to load vars for managed_node2 30583 1726853686.59901: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853686.59910: Calling all_plugins_play to load vars for managed_node2 30583 1726853686.59913: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853686.59916: Calling groups_plugins_play to load vars for managed_node2 30583 1726853686.61169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853686.62876: done with get_vars() 30583 1726853686.62916: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:34:46 -0400 (0:00:01.970) 0:00:21.967 ****** 30583 1726853686.62994: entering _queue_task() for managed_node2/package_facts 30583 1726853686.63255: worker is 1 (out of 1 available) 30583 1726853686.63269: exiting _queue_task() for managed_node2/package_facts 30583 1726853686.63284: done queuing things up, now waiting for results queue to drain 30583 1726853686.63286: waiting for pending results... 
30583 1726853686.63480: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 1726853686.63583: in run() - task 02083763-bbaf-05ea-abc5-000000000792 30583 1726853686.63594: variable 'ansible_search_path' from source: unknown 30583 1726853686.63598: variable 'ansible_search_path' from source: unknown 30583 1726853686.63628: calling self._execute() 30583 1726853686.63703: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853686.63707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853686.63716: variable 'omit' from source: magic vars 30583 1726853686.63995: variable 'ansible_distribution_major_version' from source: facts 30583 1726853686.64004: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853686.64010: variable 'omit' from source: magic vars 30583 1726853686.64065: variable 'omit' from source: magic vars 30583 1726853686.64088: variable 'omit' from source: magic vars 30583 1726853686.64119: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853686.64145: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853686.64165: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853686.64183: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853686.64192: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853686.64215: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853686.64218: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853686.64220: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853686.64295: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853686.64300: Set connection var ansible_timeout to 10 30583 1726853686.64303: Set connection var ansible_connection to ssh 30583 1726853686.64308: Set connection var ansible_shell_executable to /bin/sh 30583 1726853686.64310: Set connection var ansible_shell_type to sh 30583 1726853686.64318: Set connection var ansible_pipelining to False 30583 1726853686.64336: variable 'ansible_shell_executable' from source: unknown 30583 1726853686.64339: variable 'ansible_connection' from source: unknown 30583 1726853686.64341: variable 'ansible_module_compression' from source: unknown 30583 1726853686.64344: variable 'ansible_shell_type' from source: unknown 30583 1726853686.64346: variable 'ansible_shell_executable' from source: unknown 30583 1726853686.64348: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853686.64350: variable 'ansible_pipelining' from source: unknown 30583 1726853686.64353: variable 'ansible_timeout' from source: unknown 30583 1726853686.64360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853686.64507: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853686.64516: variable 'omit' from source: magic vars 30583 1726853686.64521: starting attempt loop 30583 1726853686.64524: running the handler 30583 1726853686.64535: _low_level_execute_command(): starting 30583 1726853686.64542: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853686.65042: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30583 1726853686.65080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853686.65084: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853686.65087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853686.65135: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853686.65138: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853686.65140: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853686.65222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853686.66952: stdout chunk (state=3): >>>/root <<< 30583 1726853686.67053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853686.67083: stderr chunk (state=3): >>><<< 30583 1726853686.67086: stdout chunk (state=3): >>><<< 30583 1726853686.67107: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853686.67117: _low_level_execute_command(): starting 30583 1726853686.67125: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853686.6710572-31590-220880934910674 `" && echo ansible-tmp-1726853686.6710572-31590-220880934910674="` echo /root/.ansible/tmp/ansible-tmp-1726853686.6710572-31590-220880934910674 `" ) && sleep 0' 30583 1726853686.67584: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853686.67587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853686.67590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853686.67600: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853686.67602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853686.67638: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853686.67642: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853686.67647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853686.67717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853686.69719: stdout chunk (state=3): >>>ansible-tmp-1726853686.6710572-31590-220880934910674=/root/.ansible/tmp/ansible-tmp-1726853686.6710572-31590-220880934910674 <<< 30583 1726853686.69888: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853686.69892: stdout chunk (state=3): >>><<< 30583 1726853686.69894: stderr chunk (state=3): >>><<< 30583 1726853686.70077: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853686.6710572-31590-220880934910674=/root/.ansible/tmp/ansible-tmp-1726853686.6710572-31590-220880934910674 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853686.70080: variable 'ansible_module_compression' from source: unknown 30583 1726853686.70083: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30583 1726853686.70106: variable 'ansible_facts' from source: unknown 30583 1726853686.70323: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853686.6710572-31590-220880934910674/AnsiballZ_package_facts.py 30583 1726853686.70492: Sending initial data 30583 1726853686.70501: Sent initial data (162 bytes) 30583 1726853686.71192: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853686.71269: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853686.71531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853686.71560: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853686.71586: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853686.71692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853686.73385: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30583 1726853686.73402: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 30583 1726853686.73429: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853686.73694: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853686.73809: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp5ilalx3j /root/.ansible/tmp/ansible-tmp-1726853686.6710572-31590-220880934910674/AnsiballZ_package_facts.py <<< 30583 1726853686.73812: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853686.6710572-31590-220880934910674/AnsiballZ_package_facts.py" <<< 30583 1726853686.73880: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp5ilalx3j" to remote "/root/.ansible/tmp/ansible-tmp-1726853686.6710572-31590-220880934910674/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853686.6710572-31590-220880934910674/AnsiballZ_package_facts.py" <<< 30583 1726853686.78382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853686.78596: stderr chunk (state=3): >>><<< 30583 1726853686.78601: stdout chunk (state=3): >>><<< 30583 1726853686.78603: done transferring module to remote 30583 1726853686.78605: _low_level_execute_command(): starting 30583 1726853686.78608: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853686.6710572-31590-220880934910674/ /root/.ansible/tmp/ansible-tmp-1726853686.6710572-31590-220880934910674/AnsiballZ_package_facts.py && sleep 0' 30583 1726853686.79985: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853686.80114: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853686.80126: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853686.80134: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853686.80265: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853686.82193: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853686.82374: stderr chunk (state=3): >>><<< 30583 1726853686.82378: stdout chunk (state=3): >>><<< 30583 1726853686.82577: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853686.82581: _low_level_execute_command(): starting 30583 1726853686.82584: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853686.6710572-31590-220880934910674/AnsiballZ_package_facts.py && sleep 0' 30583 1726853686.83777: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853686.83782: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853686.83784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853686.83802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853686.83804: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853686.83807: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853686.83809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853686.83811: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853686.83813: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853686.83815: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853686.83816: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853686.83818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853686.83819: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853686.83821: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853686.83823: stderr chunk (state=3): >>>debug2: match found <<< 30583 1726853686.83825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853686.84067: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853686.84293: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853686.84374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853687.29494: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 30583 1726853687.29527: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", 
"release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": 
"20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", 
"version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": 
"0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", 
"version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": 
"rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": 
"grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": 
"libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", 
"version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": 
"10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": 
[{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 30583 1726853687.29723: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], 
"perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", 
"version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": 
"510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 30583 1726853687.29762: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": 
"2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": 
"python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": 
"24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30583 1726853687.31611: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853687.31622: stdout chunk (state=3): >>><<< 30583 1726853687.31633: stderr chunk (state=3): >>><<< 30583 1726853687.31725: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853687.35319: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853686.6710572-31590-220880934910674/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853687.35362: _low_level_execute_command(): starting 30583 1726853687.35378: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853686.6710572-31590-220880934910674/ > /dev/null 2>&1 && sleep 0' 30583 1726853687.36044: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853687.36063: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853687.36082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853687.36111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853687.36188: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853687.36479: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853687.36500: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853687.36629: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853687.36708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853687.38829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853687.38833: stdout chunk (state=3): >>><<< 30583 1726853687.38835: stderr chunk (state=3): >>><<< 30583 1726853687.38851: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
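The repeated `auto-mux: Trying existing master at '/root/.ansible/cp/…'` lines show OpenSSH connection multiplexing: Ansible reuses one persistent SSH master connection per host, so each module invocation skips a fresh handshake. A sketch of the equivalent `ssh_config` options (the path and persist time here are illustrative, not taken from this run):

```
# Illustrative ssh_config fragment for the multiplexing seen in the log.
Host *
    ControlMaster auto        # reuse an existing master, or start one
    ControlPath ~/.ansible/cp/%C   # socket path; %C hashes host/port/user
    ControlPersist 60s        # keep the master alive after the session exits
```

Ansible passes comparable `-o ControlMaster=auto -o ControlPersist=…` options by default via `ssh_args`, which is why the master socket lives under `~/.ansible/cp/`.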
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853687.38866: handler run complete 30583 1726853687.41000: variable 'ansible_facts' from source: unknown 30583 1726853687.42220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853687.46315: variable 'ansible_facts' from source: unknown 30583 1726853687.46865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853687.48334: attempt loop complete, returning result 30583 1726853687.48347: _execute() done 30583 1726853687.48351: dumping result to json 30583 1726853687.48631: done dumping result, returning 30583 1726853687.48648: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-05ea-abc5-000000000792] 30583 1726853687.48653: sending task result for task 02083763-bbaf-05ea-abc5-000000000792 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853687.52204: no more pending results, returning what we have 30583 1726853687.52207: results queue empty 30583 1726853687.52207: checking for any_errors_fatal 30583 1726853687.52212: done checking for any_errors_fatal 30583 1726853687.52212: checking for max_fail_percentage 30583 1726853687.52214: done checking for max_fail_percentage 30583 1726853687.52215: checking to see if all hosts have failed and the running result is not ok 30583 1726853687.52215: done checking to see if all hosts have failed 30583 1726853687.52216: getting the remaining hosts for this loop 30583 1726853687.52217: done getting the remaining hosts for this loop 30583 1726853687.52220: getting the next task for host managed_node2 30583 1726853687.52227: done getting next task for host managed_node2 30583 1726853687.52230: ^ task is: 
TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853687.52235: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853687.52245: getting variables 30583 1726853687.52246: in VariableManager get_vars() 30583 1726853687.52388: Calling all_inventory to load vars for managed_node2 30583 1726853687.52392: Calling groups_inventory to load vars for managed_node2 30583 1726853687.52395: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853687.52481: Calling all_plugins_play to load vars for managed_node2 30583 1726853687.52484: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853687.52487: Calling groups_plugins_play to load vars for managed_node2 30583 1726853687.53160: done sending task result for task 02083763-bbaf-05ea-abc5-000000000792 30583 1726853687.53166: WORKER PROCESS EXITING 30583 1726853687.55246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853687.58842: done with get_vars() 30583 1726853687.58923: done getting variables 30583 1726853687.59094: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:34:47 -0400 (0:00:00.961) 0:00:22.929 ****** 30583 1726853687.59183: entering _queue_task() for managed_node2/debug 30583 1726853687.59966: worker is 1 (out of 1 available) 30583 1726853687.59980: exiting _queue_task() for managed_node2/debug 30583 1726853687.60080: done queuing things up, now waiting for results queue to drain 30583 1726853687.60082: waiting for pending results... 
30583 1726853687.60569: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853687.60882: in run() - task 02083763-bbaf-05ea-abc5-000000000730 30583 1726853687.60898: variable 'ansible_search_path' from source: unknown 30583 1726853687.60902: variable 'ansible_search_path' from source: unknown 30583 1726853687.60942: calling self._execute() 30583 1726853687.61044: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853687.61048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853687.61062: variable 'omit' from source: magic vars 30583 1726853687.62143: variable 'ansible_distribution_major_version' from source: facts 30583 1726853687.62161: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853687.62164: variable 'omit' from source: magic vars 30583 1726853687.62639: variable 'omit' from source: magic vars 30583 1726853687.62745: variable 'network_provider' from source: set_fact 30583 1726853687.62764: variable 'omit' from source: magic vars 30583 1726853687.63047: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853687.63084: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853687.63104: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853687.63120: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853687.63132: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853687.63161: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853687.63164: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 
1726853687.63166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853687.63479: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853687.63486: Set connection var ansible_timeout to 10 30583 1726853687.63488: Set connection var ansible_connection to ssh 30583 1726853687.63494: Set connection var ansible_shell_executable to /bin/sh 30583 1726853687.63496: Set connection var ansible_shell_type to sh 30583 1726853687.63506: Set connection var ansible_pipelining to False 30583 1726853687.63536: variable 'ansible_shell_executable' from source: unknown 30583 1726853687.63540: variable 'ansible_connection' from source: unknown 30583 1726853687.63542: variable 'ansible_module_compression' from source: unknown 30583 1726853687.63545: variable 'ansible_shell_type' from source: unknown 30583 1726853687.63548: variable 'ansible_shell_executable' from source: unknown 30583 1726853687.63550: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853687.63552: variable 'ansible_pipelining' from source: unknown 30583 1726853687.63589: variable 'ansible_timeout' from source: unknown 30583 1726853687.63604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853687.63821: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853687.63824: variable 'omit' from source: magic vars 30583 1726853687.63826: starting attempt loop 30583 1726853687.63829: running the handler 30583 1726853687.63901: handler run complete 30583 1726853687.63921: attempt loop complete, returning result 30583 1726853687.63931: _execute() done 30583 1726853687.63996: dumping result to json 30583 1726853687.63999: done dumping result, returning 
30583 1726853687.64001: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-05ea-abc5-000000000730] 30583 1726853687.64003: sending task result for task 02083763-bbaf-05ea-abc5-000000000730 ok: [managed_node2] => {} MSG: Using network provider: nm 30583 1726853687.64213: no more pending results, returning what we have 30583 1726853687.64217: results queue empty 30583 1726853687.64218: checking for any_errors_fatal 30583 1726853687.64230: done checking for any_errors_fatal 30583 1726853687.64231: checking for max_fail_percentage 30583 1726853687.64233: done checking for max_fail_percentage 30583 1726853687.64234: checking to see if all hosts have failed and the running result is not ok 30583 1726853687.64235: done checking to see if all hosts have failed 30583 1726853687.64236: getting the remaining hosts for this loop 30583 1726853687.64238: done getting the remaining hosts for this loop 30583 1726853687.64242: getting the next task for host managed_node2 30583 1726853687.64251: done getting next task for host managed_node2 30583 1726853687.64258: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853687.64263: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853687.64278: getting variables 30583 1726853687.64280: in VariableManager get_vars() 30583 1726853687.64317: Calling all_inventory to load vars for managed_node2 30583 1726853687.64321: Calling groups_inventory to load vars for managed_node2 30583 1726853687.64323: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853687.64333: Calling all_plugins_play to load vars for managed_node2 30583 1726853687.64336: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853687.64339: Calling groups_plugins_play to load vars for managed_node2 30583 1726853687.64884: done sending task result for task 02083763-bbaf-05ea-abc5-000000000730 30583 1726853687.64887: WORKER PROCESS EXITING 30583 1726853687.66851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853687.75011: done with get_vars() 30583 1726853687.75048: done getting variables 30583 1726853687.75119: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:34:47 -0400 (0:00:00.159) 0:00:23.088 ****** 30583 1726853687.75160: entering _queue_task() for managed_node2/fail 30583 1726853687.75569: worker is 1 (out of 1 available) 30583 1726853687.75587: exiting _queue_task() for managed_node2/fail 30583 1726853687.75601: done queuing things up, now waiting for results queue to drain 30583 1726853687.75603: waiting for pending results... 30583 1726853687.75935: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853687.76083: in run() - task 02083763-bbaf-05ea-abc5-000000000731 30583 1726853687.76272: variable 'ansible_search_path' from source: unknown 30583 1726853687.76278: variable 'ansible_search_path' from source: unknown 30583 1726853687.76283: calling self._execute() 30583 1726853687.76285: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853687.76288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853687.76291: variable 'omit' from source: magic vars 30583 1726853687.76627: variable 'ansible_distribution_major_version' from source: facts 30583 1726853687.76638: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853687.76757: variable 'network_state' from source: role '' defaults 30583 1726853687.76764: Evaluated conditional (network_state != {}): False 30583 1726853687.76769: when evaluation is False, skipping this task 30583 1726853687.76774: _execute() done 30583 1726853687.76777: dumping result to json 30583 1726853687.76779: done dumping result, returning 30583 1726853687.76788: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network 
state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-05ea-abc5-000000000731] 30583 1726853687.76792: sending task result for task 02083763-bbaf-05ea-abc5-000000000731 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853687.76999: no more pending results, returning what we have 30583 1726853687.77003: results queue empty 30583 1726853687.77004: checking for any_errors_fatal 30583 1726853687.77011: done checking for any_errors_fatal 30583 1726853687.77012: checking for max_fail_percentage 30583 1726853687.77014: done checking for max_fail_percentage 30583 1726853687.77015: checking to see if all hosts have failed and the running result is not ok 30583 1726853687.77015: done checking to see if all hosts have failed 30583 1726853687.77016: getting the remaining hosts for this loop 30583 1726853687.77018: done getting the remaining hosts for this loop 30583 1726853687.77022: getting the next task for host managed_node2 30583 1726853687.77029: done getting next task for host managed_node2 30583 1726853687.77033: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30583 1726853687.77038: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853687.77059: getting variables 30583 1726853687.77060: in VariableManager get_vars() 30583 1726853687.77183: Calling all_inventory to load vars for managed_node2 30583 1726853687.77187: Calling groups_inventory to load vars for managed_node2 30583 1726853687.77189: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853687.77199: Calling all_plugins_play to load vars for managed_node2 30583 1726853687.77201: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853687.77205: Calling groups_plugins_play to load vars for managed_node2 30583 1726853687.77882: done sending task result for task 02083763-bbaf-05ea-abc5-000000000731 30583 1726853687.77886: WORKER PROCESS EXITING 30583 1726853687.79393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853687.80957: done with get_vars() 30583 1726853687.80989: done getting variables 30583 1726853687.81054: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:34:47 -0400 (0:00:00.059) 0:00:23.148 ****** 30583 1726853687.81125: entering _queue_task() for managed_node2/fail 30583 1726853687.81750: worker is 1 (out of 1 available) 30583 1726853687.81763: exiting _queue_task() for managed_node2/fail 30583 1726853687.81979: done queuing things up, now waiting for results queue to drain 30583 1726853687.81981: waiting for pending results... 30583 1726853687.82086: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30583 1726853687.82326: in run() - task 02083763-bbaf-05ea-abc5-000000000732 30583 1726853687.82340: variable 'ansible_search_path' from source: unknown 30583 1726853687.82343: variable 'ansible_search_path' from source: unknown 30583 1726853687.82536: calling self._execute() 30583 1726853687.82720: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853687.82723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853687.82725: variable 'omit' from source: magic vars 30583 1726853687.83361: variable 'ansible_distribution_major_version' from source: facts 30583 1726853687.83369: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853687.83489: variable 'network_state' from source: role '' defaults 30583 1726853687.83500: Evaluated conditional (network_state != {}): False 30583 1726853687.83503: when evaluation is False, skipping this task 30583 1726853687.83506: _execute() done 30583 1726853687.83508: dumping result to json 30583 1726853687.83511: done dumping result, returning 30583 1726853687.83520: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the 
system version of the managed host is below 8 [02083763-bbaf-05ea-abc5-000000000732] 30583 1726853687.83524: sending task result for task 02083763-bbaf-05ea-abc5-000000000732 30583 1726853687.83624: done sending task result for task 02083763-bbaf-05ea-abc5-000000000732 30583 1726853687.83626: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853687.83707: no more pending results, returning what we have 30583 1726853687.83712: results queue empty 30583 1726853687.83713: checking for any_errors_fatal 30583 1726853687.83720: done checking for any_errors_fatal 30583 1726853687.83721: checking for max_fail_percentage 30583 1726853687.83723: done checking for max_fail_percentage 30583 1726853687.83724: checking to see if all hosts have failed and the running result is not ok 30583 1726853687.83724: done checking to see if all hosts have failed 30583 1726853687.83725: getting the remaining hosts for this loop 30583 1726853687.83727: done getting the remaining hosts for this loop 30583 1726853687.83732: getting the next task for host managed_node2 30583 1726853687.83740: done getting next task for host managed_node2 30583 1726853687.83745: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30583 1726853687.83751: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853687.83773: getting variables 30583 1726853687.83775: in VariableManager get_vars() 30583 1726853687.83814: Calling all_inventory to load vars for managed_node2 30583 1726853687.83816: Calling groups_inventory to load vars for managed_node2 30583 1726853687.83819: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853687.83831: Calling all_plugins_play to load vars for managed_node2 30583 1726853687.83835: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853687.83838: Calling groups_plugins_play to load vars for managed_node2 30583 1726853687.85477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853687.86982: done with get_vars() 30583 1726853687.87010: done getting variables 30583 1726853687.87078: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 
September 2024 13:34:47 -0400 (0:00:00.059) 0:00:23.208 ****** 30583 1726853687.87117: entering _queue_task() for managed_node2/fail 30583 1726853687.87484: worker is 1 (out of 1 available) 30583 1726853687.87497: exiting _queue_task() for managed_node2/fail 30583 1726853687.87509: done queuing things up, now waiting for results queue to drain 30583 1726853687.87511: waiting for pending results... 30583 1726853687.87822: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30583 1726853687.88080: in run() - task 02083763-bbaf-05ea-abc5-000000000733 30583 1726853687.88085: variable 'ansible_search_path' from source: unknown 30583 1726853687.88088: variable 'ansible_search_path' from source: unknown 30583 1726853687.88098: calling self._execute() 30583 1726853687.88209: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853687.88214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853687.88218: variable 'omit' from source: magic vars 30583 1726853687.88666: variable 'ansible_distribution_major_version' from source: facts 30583 1726853687.88672: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853687.88776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853687.93136: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853687.93270: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853687.93315: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853687.93361: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 
1726853687.93385: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853687.93618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853687.93622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853687.93625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853687.93627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853687.93630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853687.93658: variable 'ansible_distribution_major_version' from source: facts 30583 1726853687.93674: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30583 1726853687.93784: variable 'ansible_distribution' from source: facts 30583 1726853687.93788: variable '__network_rh_distros' from source: role '' defaults 30583 1726853687.93797: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30583 1726853687.94377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853687.94381: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853687.94384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853687.94386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853687.94389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853687.94391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853687.94393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853687.94395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853687.94397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853687.94481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 
1726853687.94512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853687.94535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853687.94558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853687.94708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853687.94722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853687.95345: variable 'network_connections' from source: include params
30583 1726853687.95358: variable 'interface' from source: play vars
30583 1726853687.95429: variable 'interface' from source: play vars
30583 1726853687.95558: variable 'network_state' from source: role '' defaults
30583 1726853687.95615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30583 1726853687.96042: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30583 1726853687.96175: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30583 1726853687.96327: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30583 1726853687.96352: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30583 1726853687.96399: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30583 1726853687.96478: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30583 1726853687.96549: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853687.97064: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30583 1726853687.97068: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False
30583 1726853687.97072: when evaluation is False, skipping this task
30583 1726853687.97075: _execute() done
30583 1726853687.97077: dumping result to json
30583 1726853687.97080: done dumping result, returning
30583 1726853687.97085: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-05ea-abc5-000000000733]
30583 1726853687.97088: sending task result for task 02083763-bbaf-05ea-abc5-000000000733
30583 1726853687.97152: done sending task result for task 02083763-bbaf-05ea-abc5-000000000733
30583 1726853687.97157: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0",
    "skip_reason": "Conditional result was False"
}
30583 1726853687.97240: no more pending results, returning what we have
30583 1726853687.97244: results queue empty
30583 1726853687.97245: checking for any_errors_fatal
30583 1726853687.97252: done checking for any_errors_fatal
30583 1726853687.97253: checking for max_fail_percentage
30583 1726853687.97255: done checking for max_fail_percentage
30583 1726853687.97256: checking to see if all hosts have failed and the running result is not ok
30583 1726853687.97257: done checking to see if all hosts have failed
30583 1726853687.97257: getting the remaining hosts for this loop
30583 1726853687.97260: done getting the remaining hosts for this loop
30583 1726853687.97264: getting the next task for host managed_node2
30583 1726853687.97275: done getting next task for host managed_node2
30583 1726853687.97280: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
30583 1726853687.97284: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853687.97304: getting variables
30583 1726853687.97307: in VariableManager get_vars()
30583 1726853687.97346: Calling all_inventory to load vars for managed_node2
30583 1726853687.97349: Calling groups_inventory to load vars for managed_node2
30583 1726853687.97351: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853687.97361: Calling all_plugins_play to load vars for managed_node2
30583 1726853687.97364: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853687.97368: Calling groups_plugins_play to load vars for managed_node2
30583 1726853687.98978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853688.00568: done with get_vars()
30583 1726853688.00596: done getting variables
30583 1726853688.00655: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024  13:34:48 -0400 (0:00:00.137)       0:00:23.346 ******
30583 1726853688.00896: entering _queue_task() for managed_node2/dnf
30583 1726853688.01448: worker is 1 (out of 1 available)
30583 1726853688.01461: exiting _queue_task() for managed_node2/dnf
30583 1726853688.01677: done queuing things up, now waiting for results queue to drain
30583 1726853688.01679: waiting for pending results...
30583 1726853688.01932: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
30583 1726853688.02105: in run() - task 02083763-bbaf-05ea-abc5-000000000734
30583 1726853688.02115: variable 'ansible_search_path' from source: unknown
30583 1726853688.02124: variable 'ansible_search_path' from source: unknown
30583 1726853688.02164: calling self._execute()
30583 1726853688.02377: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853688.02381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853688.02384: variable 'omit' from source: magic vars
30583 1726853688.02710: variable 'ansible_distribution_major_version' from source: facts
30583 1726853688.02722: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853688.02932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30583 1726853688.05691: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30583 1726853688.05877: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30583 1726853688.05881: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30583 1726853688.05884: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30583 1726853688.05887: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30583 1726853688.06029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853688.06033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853688.06035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853688.06070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853688.06089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853688.06355: variable 'ansible_distribution' from source: facts
30583 1726853688.06359: variable 'ansible_distribution_major_version' from source: facts
30583 1726853688.06362: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
30583 1726853688.06379: variable '__network_wireless_connections_defined' from source: role '' defaults
30583 1726853688.06512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853688.06552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853688.06583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853688.06625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853688.06637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853688.06680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853688.06721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853688.06746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853688.06794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853688.06808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853688.06842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853688.06868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853688.06896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853688.06931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853688.06945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853688.07178: variable 'network_connections' from source: include params
30583 1726853688.07182: variable 'interface' from source: play vars
30583 1726853688.07205: variable 'interface' from source: play vars
30583 1726853688.07289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30583 1726853688.07474: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30583 1726853688.07509: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30583 1726853688.07542: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30583 1726853688.07599: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30583 1726853688.07723: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30583 1726853688.07727: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30583 1726853688.07738: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853688.07740: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30583 1726853688.07768: variable '__network_team_connections_defined' from source: role '' defaults
30583 1726853688.08081: variable 'network_connections' from source: include params
30583 1726853688.08084: variable 'interface' from source: play vars
30583 1726853688.08187: variable 'interface' from source: play vars
30583 1726853688.08190: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
30583 1726853688.08192: when evaluation is False, skipping this task
30583 1726853688.08194: _execute() done
30583 1726853688.08196: dumping result to json
30583 1726853688.08197: done dumping result, returning
30583 1726853688.08199: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000000734]
30583 1726853688.08201: sending task result for task 02083763-bbaf-05ea-abc5-000000000734
30583 1726853688.08281: done sending task result for task 02083763-bbaf-05ea-abc5-000000000734
30583 1726853688.08285: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
30583 1726853688.08336: no more pending results, returning what we have
30583 1726853688.08340: results queue empty
30583 1726853688.08342: checking for any_errors_fatal
30583 1726853688.08347: done checking for any_errors_fatal
30583 1726853688.08348: checking for max_fail_percentage
30583 1726853688.08350: done checking for max_fail_percentage
30583 1726853688.08351: checking to see if all hosts have failed and the running result is not ok
30583 1726853688.08352: done checking to see if all hosts have failed
30583 1726853688.08353: getting the remaining hosts for this loop
30583 1726853688.08357: done getting the remaining hosts for this loop
30583 1726853688.08362: getting the next task for host managed_node2
30583 1726853688.08372: done getting next task for host managed_node2
30583 1726853688.08489: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
30583 1726853688.08494: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853688.08512: getting variables
30583 1726853688.08514: in VariableManager get_vars()
30583 1726853688.08553: Calling all_inventory to load vars for managed_node2
30583 1726853688.08559: Calling groups_inventory to load vars for managed_node2
30583 1726853688.08562: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853688.08858: Calling all_plugins_play to load vars for managed_node2
30583 1726853688.08862: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853688.08866: Calling groups_plugins_play to load vars for managed_node2
30583 1726853688.11234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853688.12826: done with get_vars()
30583 1726853688.12853: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
30583 1726853688.12932: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024  13:34:48 -0400 (0:00:00.120)       0:00:23.466 ******
30583 1726853688.12966: entering _queue_task() for managed_node2/yum
30583 1726853688.13328: worker is 1 (out of 1 available)
30583 1726853688.13340: exiting _queue_task() for managed_node2/yum
30583 1726853688.13354: done queuing things up, now waiting for results queue to drain
30583 1726853688.13355: waiting for pending results...
30583 1726853688.13726: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
30583 1726853688.13825: in run() - task 02083763-bbaf-05ea-abc5-000000000735
30583 1726853688.13830: variable 'ansible_search_path' from source: unknown
30583 1726853688.13835: variable 'ansible_search_path' from source: unknown
30583 1726853688.13926: calling self._execute()
30583 1726853688.13944: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853688.13953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853688.13960: variable 'omit' from source: magic vars
30583 1726853688.14357: variable 'ansible_distribution_major_version' from source: facts
30583 1726853688.14361: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853688.15077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30583 1726853688.16745: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30583 1726853688.16826: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30583 1726853688.16862: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30583 1726853688.16897: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30583 1726853688.16921: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30583 1726853688.17002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853688.17028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853688.17061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853688.17099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853688.17112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853688.17258: variable 'ansible_distribution_major_version' from source: facts
30583 1726853688.17262: Evaluated conditional (ansible_distribution_major_version | int < 8): False
30583 1726853688.17264: when evaluation is False, skipping this task
30583 1726853688.17268: _execute() done
30583 1726853688.17270: dumping result to json
30583 1726853688.17274: done dumping result, returning
30583 1726853688.17277: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000000735]
30583 1726853688.17279: sending task result for task 02083763-bbaf-05ea-abc5-000000000735
30583 1726853688.17346: done sending task result for task 02083763-bbaf-05ea-abc5-000000000735
30583 1726853688.17348: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
30583 1726853688.17413: no more pending results, returning what we have
30583 1726853688.17417: results queue empty
30583 1726853688.17418: checking for any_errors_fatal
30583 1726853688.17423: done checking for any_errors_fatal
30583 1726853688.17424: checking for max_fail_percentage
30583 1726853688.17427: done checking for max_fail_percentage
30583 1726853688.17428: checking to see if all hosts have failed and the running result is not ok
30583 1726853688.17429: done checking to see if all hosts have failed
30583 1726853688.17429: getting the remaining hosts for this loop
30583 1726853688.17432: done getting the remaining hosts for this loop
30583 1726853688.17436: getting the next task for host managed_node2
30583 1726853688.17445: done getting next task for host managed_node2
30583 1726853688.17449: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
30583 1726853688.17454: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853688.17474: getting variables
30583 1726853688.17477: in VariableManager get_vars()
30583 1726853688.17515: Calling all_inventory to load vars for managed_node2
30583 1726853688.17518: Calling groups_inventory to load vars for managed_node2
30583 1726853688.17520: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853688.17531: Calling all_plugins_play to load vars for managed_node2
30583 1726853688.17534: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853688.17537: Calling groups_plugins_play to load vars for managed_node2
30583 1726853688.19074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853688.20775: done with get_vars()
30583 1726853688.20797: done getting variables
30583 1726853688.20855: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024  13:34:48 -0400 (0:00:00.079)       0:00:23.546 ******
30583 1726853688.20894: entering _queue_task() for managed_node2/fail
30583 1726853688.21241: worker is 1 (out of 1 available)
30583 1726853688.21253: exiting _queue_task() for managed_node2/fail
30583 1726853688.21265: done queuing things up, now waiting for results queue to drain
30583 1726853688.21267: waiting for pending results...
30583 1726853688.21692: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
30583 1726853688.21699: in run() - task 02083763-bbaf-05ea-abc5-000000000736
30583 1726853688.21712: variable 'ansible_search_path' from source: unknown
30583 1726853688.21716: variable 'ansible_search_path' from source: unknown
30583 1726853688.21750: calling self._execute()
30583 1726853688.22004: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853688.22008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853688.22012: variable 'omit' from source: magic vars
30583 1726853688.22236: variable 'ansible_distribution_major_version' from source: facts
30583 1726853688.22247: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853688.22366: variable '__network_wireless_connections_defined' from source: role '' defaults
30583 1726853688.22562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30583 1726853688.26545: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30583 1726853688.26689: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30583 1726853688.26784: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30583 1726853688.26818: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30583 1726853688.26959: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30583 1726853688.27035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853688.27063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853688.27187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853688.27230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853688.27245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853688.27397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853688.27425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853688.27450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853688.27491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853688.27506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853688.27667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853688.27750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853688.27777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853688.27889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853688.27904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853688.28316: variable 'network_connections' from source: include params
30583 1726853688.28326: variable 'interface' from source: play vars
30583 1726853688.28506: variable 'interface' from source: play vars
30583 1726853688.28576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30583 1726853688.28964: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30583 1726853688.29016: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30583 1726853688.29161: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30583 1726853688.29191: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30583 1726853688.29233: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30583 1726853688.29253: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30583 1726853688.29401: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853688.29423: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30583 1726853688.29485: variable '__network_team_connections_defined' from source: role '' defaults
30583 1726853688.30110: variable 'network_connections' from source: include params
30583 1726853688.30113: variable 'interface' from source: play vars
30583 1726853688.30475: variable 'interface' from source: play vars
30583 1726853688.30479: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
30583 1726853688.30482: when evaluation is False, skipping this task
30583 1726853688.30485: _execute() done
30583 1726853688.30487: dumping result to json
30583 1726853688.30489: done dumping result, returning
30583 1726853688.30491: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000000736]
30583 1726853688.30492: sending task result for task 02083763-bbaf-05ea-abc5-000000000736
30583 1726853688.30564: done sending task result for task 02083763-bbaf-05ea-abc5-000000000736
30583 1726853688.30566: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
30583 1726853688.30623: no more pending results, returning what we have
30583 1726853688.30626: results queue empty
30583 1726853688.30627: checking for any_errors_fatal
30583 1726853688.30635: done checking for any_errors_fatal
30583 1726853688.30636: checking for max_fail_percentage
30583 1726853688.30638: done checking for max_fail_percentage
30583 1726853688.30639: checking to see if all hosts have failed and the running result is not ok
30583 1726853688.30640: done checking to see if all hosts have failed
30583 1726853688.30641: getting the remaining hosts for this loop
30583 1726853688.30643: done getting the remaining hosts for this loop
30583 1726853688.30646: getting the next task for host managed_node2
30583 1726853688.30654: done getting next task for host managed_node2
30583 1726853688.30659: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
30583 1726853688.30663: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853688.30685: getting variables
30583 1726853688.30687: in VariableManager get_vars()
30583 1726853688.30726: Calling all_inventory to load vars for managed_node2
30583 1726853688.30729: Calling groups_inventory to load vars for managed_node2
30583 1726853688.30731: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853688.30742: Calling all_plugins_play to load vars for managed_node2
30583 1726853688.30745: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853688.30747: Calling groups_plugins_play to load vars for managed_node2
30583 1726853688.33763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853688.35474: done with get_vars()
30583 1726853688.35545: done getting variables
30583 1726853688.35646: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:34:48 -0400 (0:00:00.148) 0:00:23.694 ****** 30583 1726853688.35741: entering _queue_task() for managed_node2/package 30583 1726853688.36507: worker is 1 (out of 1 available) 30583 1726853688.36522: exiting _queue_task() for managed_node2/package 30583 1726853688.36535: done queuing things up, now waiting for results queue to drain 30583 1726853688.36537: waiting for pending results... 30583 1726853688.37243: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 30583 1726853688.37312: in run() - task 02083763-bbaf-05ea-abc5-000000000737 30583 1726853688.37340: variable 'ansible_search_path' from source: unknown 30583 1726853688.37350: variable 'ansible_search_path' from source: unknown 30583 1726853688.37404: calling self._execute() 30583 1726853688.37523: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853688.37533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853688.37547: variable 'omit' from source: magic vars 30583 1726853688.37965: variable 'ansible_distribution_major_version' from source: facts 30583 1726853688.37988: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853688.38200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853688.38499: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853688.38551: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853688.38597: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853688.38711: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853688.38864: variable 'network_packages' from source: role '' defaults 30583 1726853688.38996: variable '__network_provider_setup' from source: role '' defaults 30583 1726853688.39027: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853688.39088: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853688.39103: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853688.39244: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853688.39375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853688.42052: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853688.42315: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853688.42345: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853688.42388: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853688.42497: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853688.42966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853688.43003: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853688.43036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853688.43082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853688.43102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853688.43153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853688.43185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853688.43214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853688.43259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853688.43310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 
1726853688.43691: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853688.43812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853688.43843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853688.43876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853688.43941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853688.43944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853688.44047: variable 'ansible_python' from source: facts 30583 1726853688.44276: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853688.44280: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853688.44282: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853688.44384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853688.44422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853688.44453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853688.44504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853688.44623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853688.44626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853688.44640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853688.44659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853688.44705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853688.44729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853688.44898: variable 'network_connections' from source: include params 
30583 1726853688.44909: variable 'interface' from source: play vars 30583 1726853688.45021: variable 'interface' from source: play vars 30583 1726853688.45115: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853688.45147: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853688.45194: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853688.45232: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853688.45299: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853688.45610: variable 'network_connections' from source: include params 30583 1726853688.45677: variable 'interface' from source: play vars 30583 1726853688.45739: variable 'interface' from source: play vars 30583 1726853688.45803: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853688.45896: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853688.46205: variable 'network_connections' from source: include params 30583 1726853688.46215: variable 'interface' from source: play vars 30583 1726853688.46331: variable 'interface' from source: play vars 30583 1726853688.46364: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853688.46624: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853688.47098: variable 'network_connections' 
from source: include params 30583 1726853688.47180: variable 'interface' from source: play vars 30583 1726853688.47246: variable 'interface' from source: play vars 30583 1726853688.47451: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853688.47714: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853688.47717: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853688.47720: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853688.48135: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853688.49125: variable 'network_connections' from source: include params 30583 1726853688.49139: variable 'interface' from source: play vars 30583 1726853688.49244: variable 'interface' from source: play vars 30583 1726853688.49247: variable 'ansible_distribution' from source: facts 30583 1726853688.49249: variable '__network_rh_distros' from source: role '' defaults 30583 1726853688.49251: variable 'ansible_distribution_major_version' from source: facts 30583 1726853688.49275: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853688.49446: variable 'ansible_distribution' from source: facts 30583 1726853688.49466: variable '__network_rh_distros' from source: role '' defaults 30583 1726853688.49484: variable 'ansible_distribution_major_version' from source: facts 30583 1726853688.49574: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853688.49670: variable 'ansible_distribution' from source: facts 30583 1726853688.49688: variable '__network_rh_distros' from source: role '' defaults 30583 1726853688.49697: variable 'ansible_distribution_major_version' from source: facts 30583 1726853688.49734: variable 'network_provider' from source: set_fact 30583 
1726853688.49752: variable 'ansible_facts' from source: unknown 30583 1726853688.50484: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30583 1726853688.50492: when evaluation is False, skipping this task 30583 1726853688.50499: _execute() done 30583 1726853688.50504: dumping result to json 30583 1726853688.50511: done dumping result, returning 30583 1726853688.50522: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-05ea-abc5-000000000737] 30583 1726853688.50531: sending task result for task 02083763-bbaf-05ea-abc5-000000000737 skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30583 1726853688.50705: no more pending results, returning what we have 30583 1726853688.50709: results queue empty 30583 1726853688.50711: checking for any_errors_fatal 30583 1726853688.50717: done checking for any_errors_fatal 30583 1726853688.50718: checking for max_fail_percentage 30583 1726853688.50721: done checking for max_fail_percentage 30583 1726853688.50722: checking to see if all hosts have failed and the running result is not ok 30583 1726853688.50722: done checking to see if all hosts have failed 30583 1726853688.50723: getting the remaining hosts for this loop 30583 1726853688.50725: done getting the remaining hosts for this loop 30583 1726853688.50729: getting the next task for host managed_node2 30583 1726853688.50739: done getting next task for host managed_node2 30583 1726853688.50743: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30583 1726853688.50749: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853688.50877: getting variables 30583 1726853688.50879: in VariableManager get_vars() 30583 1726853688.50922: Calling all_inventory to load vars for managed_node2 30583 1726853688.50925: Calling groups_inventory to load vars for managed_node2 30583 1726853688.50933: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853688.50946: Calling all_plugins_play to load vars for managed_node2 30583 1726853688.50949: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853688.50952: Calling groups_plugins_play to load vars for managed_node2 30583 1726853688.51529: done sending task result for task 02083763-bbaf-05ea-abc5-000000000737 30583 1726853688.51533: WORKER PROCESS EXITING 30583 1726853688.52972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853688.54570: done with get_vars() 30583 1726853688.54596: done getting variables 30583 1726853688.54650: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:34:48 -0400 (0:00:00.189) 0:00:23.884 ****** 30583 1726853688.54692: entering _queue_task() for managed_node2/package 30583 1726853688.55193: worker is 1 (out of 1 available) 30583 1726853688.55205: exiting _queue_task() for managed_node2/package 30583 1726853688.55217: done queuing things up, now waiting for results queue to drain 30583 1726853688.55218: waiting for pending results... 30583 1726853688.55366: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30583 1726853688.55528: in run() - task 02083763-bbaf-05ea-abc5-000000000738 30583 1726853688.55553: variable 'ansible_search_path' from source: unknown 30583 1726853688.55566: variable 'ansible_search_path' from source: unknown 30583 1726853688.55606: calling self._execute() 30583 1726853688.55712: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853688.55724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853688.55740: variable 'omit' from source: magic vars 30583 1726853688.56137: variable 'ansible_distribution_major_version' from source: facts 30583 1726853688.56153: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853688.56287: variable 'network_state' from source: role '' defaults 30583 1726853688.56303: Evaluated conditional (network_state != {}): False 30583 1726853688.56320: when evaluation 
is False, skipping this task 30583 1726853688.56328: _execute() done 30583 1726853688.56422: dumping result to json 30583 1726853688.56426: done dumping result, returning 30583 1726853688.56429: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-05ea-abc5-000000000738] 30583 1726853688.56432: sending task result for task 02083763-bbaf-05ea-abc5-000000000738 30583 1726853688.56507: done sending task result for task 02083763-bbaf-05ea-abc5-000000000738 30583 1726853688.56510: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853688.56578: no more pending results, returning what we have 30583 1726853688.56582: results queue empty 30583 1726853688.56584: checking for any_errors_fatal 30583 1726853688.56589: done checking for any_errors_fatal 30583 1726853688.56590: checking for max_fail_percentage 30583 1726853688.56592: done checking for max_fail_percentage 30583 1726853688.56593: checking to see if all hosts have failed and the running result is not ok 30583 1726853688.56594: done checking to see if all hosts have failed 30583 1726853688.56594: getting the remaining hosts for this loop 30583 1726853688.56596: done getting the remaining hosts for this loop 30583 1726853688.56600: getting the next task for host managed_node2 30583 1726853688.56609: done getting next task for host managed_node2 30583 1726853688.56614: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30583 1726853688.56619: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853688.56646: getting variables 30583 1726853688.56648: in VariableManager get_vars() 30583 1726853688.56691: Calling all_inventory to load vars for managed_node2 30583 1726853688.56694: Calling groups_inventory to load vars for managed_node2 30583 1726853688.56696: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853688.56708: Calling all_plugins_play to load vars for managed_node2 30583 1726853688.56712: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853688.56716: Calling groups_plugins_play to load vars for managed_node2 30583 1726853688.58315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853688.59965: done with get_vars() 30583 1726853688.59996: done getting variables 30583 1726853688.60078: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:34:48 -0400 (0:00:00.054) 0:00:23.938 ****** 30583 1726853688.60117: entering _queue_task() for managed_node2/package 30583 1726853688.60470: worker is 1 (out of 1 available) 30583 1726853688.60486: exiting _queue_task() for managed_node2/package 30583 1726853688.60498: done queuing things up, now waiting for results queue to drain 30583 1726853688.60499: waiting for pending results... 30583 1726853688.60888: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30583 1726853688.60967: in run() - task 02083763-bbaf-05ea-abc5-000000000739 30583 1726853688.60987: variable 'ansible_search_path' from source: unknown 30583 1726853688.60996: variable 'ansible_search_path' from source: unknown 30583 1726853688.61044: calling self._execute() 30583 1726853688.61146: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853688.61160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853688.61177: variable 'omit' from source: magic vars 30583 1726853688.61568: variable 'ansible_distribution_major_version' from source: facts 30583 1726853688.61585: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853688.61699: variable 'network_state' from source: role '' defaults 30583 1726853688.61770: Evaluated conditional (network_state != {}): False 30583 1726853688.61775: when evaluation is False, skipping this task 30583 1726853688.61779: _execute() done 30583 1726853688.61782: dumping 
result to json 30583 1726853688.61784: done dumping result, returning 30583 1726853688.61787: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-05ea-abc5-000000000739] 30583 1726853688.61789: sending task result for task 02083763-bbaf-05ea-abc5-000000000739 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853688.62045: no more pending results, returning what we have 30583 1726853688.62050: results queue empty 30583 1726853688.62051: checking for any_errors_fatal 30583 1726853688.62063: done checking for any_errors_fatal 30583 1726853688.62064: checking for max_fail_percentage 30583 1726853688.62066: done checking for max_fail_percentage 30583 1726853688.62067: checking to see if all hosts have failed and the running result is not ok 30583 1726853688.62068: done checking to see if all hosts have failed 30583 1726853688.62069: getting the remaining hosts for this loop 30583 1726853688.62074: done getting the remaining hosts for this loop 30583 1726853688.62079: getting the next task for host managed_node2 30583 1726853688.62092: done getting next task for host managed_node2 30583 1726853688.62097: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30583 1726853688.62103: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853688.62189: getting variables 30583 1726853688.62191: in VariableManager get_vars() 30583 1726853688.62237: Calling all_inventory to load vars for managed_node2 30583 1726853688.62240: Calling groups_inventory to load vars for managed_node2 30583 1726853688.62243: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853688.62321: Calling all_plugins_play to load vars for managed_node2 30583 1726853688.62325: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853688.62328: Calling groups_plugins_play to load vars for managed_node2 30583 1726853688.62931: done sending task result for task 02083763-bbaf-05ea-abc5-000000000739 30583 1726853688.62934: WORKER PROCESS EXITING 30583 1726853688.63984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853688.65580: done with get_vars() 30583 1726853688.65604: done getting variables 30583 1726853688.65658: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:34:48 -0400 (0:00:00.055) 0:00:23.994 ****** 30583 1726853688.65701: entering _queue_task() for managed_node2/service 30583 1726853688.66066: worker is 1 (out of 1 available) 30583 1726853688.66284: exiting _queue_task() for managed_node2/service 30583 1726853688.66295: done queuing things up, now waiting for results queue to drain 30583 1726853688.66296: waiting for pending results... 30583 1726853688.66412: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30583 1726853688.66558: in run() - task 02083763-bbaf-05ea-abc5-00000000073a 30583 1726853688.66581: variable 'ansible_search_path' from source: unknown 30583 1726853688.66590: variable 'ansible_search_path' from source: unknown 30583 1726853688.66640: calling self._execute() 30583 1726853688.66753: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853688.66768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853688.66786: variable 'omit' from source: magic vars 30583 1726853688.67260: variable 'ansible_distribution_major_version' from source: facts 30583 1726853688.67263: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853688.67340: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853688.67565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853688.69958: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853688.70049: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853688.70103: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853688.70144: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853688.70187: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853688.70294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853688.70321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853688.70402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853688.70405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853688.70422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853688.70481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853688.70539: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853688.70544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853688.70592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853688.70615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853688.70727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853688.70730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853688.70733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853688.70774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853688.70794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 
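The task header logged above ("Restart NetworkManager due to wireless or team interfaces", task path `roles/network/tasks/main.yml:109`) together with the conditional evaluated just below suggests a guarded service-restart task. A minimal sketch of what such a task could look like follows; the task name, path, and condition variables are taken from the log, while the module arguments and structure are assumptions and may differ from the actual role source:

```yaml
# Hedged sketch, not the actual role task. The `when:` expression matches
# the false_condition reported in the skip result below; the service name
# and state are assumed.
- name: Restart NetworkManager due to wireless or team interfaces
  service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

Because both `__network_wireless_connections_defined` and `__network_team_connections_defined` evaluate to False for this run, the executor reports "Conditional result was False" and skips the task without contacting the host.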
30583 1726853688.71001: variable 'network_connections' from source: include params 30583 1726853688.71018: variable 'interface' from source: play vars 30583 1726853688.71103: variable 'interface' from source: play vars 30583 1726853688.71192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853688.71370: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853688.71434: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853688.71486: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853688.71514: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853688.71595: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853688.71598: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853688.71627: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853688.71661: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853688.71735: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853688.71997: variable 'network_connections' from source: include params 30583 1726853688.72029: variable 'interface' from source: play vars 30583 1726853688.72085: variable 
'interface' from source: play vars 30583 1726853688.72124: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853688.72137: when evaluation is False, skipping this task 30583 1726853688.72162: _execute() done 30583 1726853688.72165: dumping result to json 30583 1726853688.72167: done dumping result, returning 30583 1726853688.72276: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-00000000073a] 30583 1726853688.72280: sending task result for task 02083763-bbaf-05ea-abc5-00000000073a 30583 1726853688.72349: done sending task result for task 02083763-bbaf-05ea-abc5-00000000073a 30583 1726853688.72362: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853688.72415: no more pending results, returning what we have 30583 1726853688.72419: results queue empty 30583 1726853688.72420: checking for any_errors_fatal 30583 1726853688.72426: done checking for any_errors_fatal 30583 1726853688.72426: checking for max_fail_percentage 30583 1726853688.72429: done checking for max_fail_percentage 30583 1726853688.72430: checking to see if all hosts have failed and the running result is not ok 30583 1726853688.72430: done checking to see if all hosts have failed 30583 1726853688.72431: getting the remaining hosts for this loop 30583 1726853688.72433: done getting the remaining hosts for this loop 30583 1726853688.72438: getting the next task for host managed_node2 30583 1726853688.72447: done getting next task for host managed_node2 30583 1726853688.72451: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30583 1726853688.72460: ^ state is: HOST STATE: block=4, task=2, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853688.72586: getting variables 30583 1726853688.72588: in VariableManager get_vars() 30583 1726853688.72627: Calling all_inventory to load vars for managed_node2 30583 1726853688.72631: Calling groups_inventory to load vars for managed_node2 30583 1726853688.72634: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853688.72646: Calling all_plugins_play to load vars for managed_node2 30583 1726853688.72649: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853688.72653: Calling groups_plugins_play to load vars for managed_node2 30583 1726853688.74209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853688.75889: done with get_vars() 30583 1726853688.75919: done getting variables 30583 1726853688.75983: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:34:48 -0400 (0:00:00.103) 0:00:24.097 ****** 30583 1726853688.76017: entering _queue_task() for managed_node2/service 30583 1726853688.76600: worker is 1 (out of 1 available) 30583 1726853688.76610: exiting _queue_task() for managed_node2/service 30583 1726853688.76619: done queuing things up, now waiting for results queue to drain 30583 1726853688.76621: waiting for pending results... 
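The next task queued for managed_node2 is "Enable and start NetworkManager" (task path `roles/network/tasks/main.yml:122`), dispatched through the `service` action plugin. A hedged sketch of such a task is below; the variable names (`network_service_name`, `network_provider`, `network_state`) come straight from the log entries that follow, but the exact module arguments are assumptions:

```yaml
# Hedged sketch, not the actual role task. Unlike the restart task above,
# this one's conditional evaluates True here (network_provider == "nm"),
# so the service module is actually executed on the remote host.
- name: Enable and start NetworkManager
  service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}
```

Since the condition holds, the run proceeds past conditional evaluation into connection setup (ssh connection plugin, sh shell plugin) and then to module transfer, as the subsequent `_low_level_execute_command()` entries show.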
30583 1726853688.76866: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30583 1726853688.76874: in run() - task 02083763-bbaf-05ea-abc5-00000000073b 30583 1726853688.76878: variable 'ansible_search_path' from source: unknown 30583 1726853688.76886: variable 'ansible_search_path' from source: unknown 30583 1726853688.76930: calling self._execute() 30583 1726853688.77063: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853688.77176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853688.77181: variable 'omit' from source: magic vars 30583 1726853688.77540: variable 'ansible_distribution_major_version' from source: facts 30583 1726853688.77560: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853688.77821: variable 'network_provider' from source: set_fact 30583 1726853688.77826: variable 'network_state' from source: role '' defaults 30583 1726853688.77829: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30583 1726853688.77831: variable 'omit' from source: magic vars 30583 1726853688.77875: variable 'omit' from source: magic vars 30583 1726853688.77908: variable 'network_service_name' from source: role '' defaults 30583 1726853688.77992: variable 'network_service_name' from source: role '' defaults 30583 1726853688.78108: variable '__network_provider_setup' from source: role '' defaults 30583 1726853688.78119: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853688.78198: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853688.78213: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853688.78289: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853688.78526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 30583 1726853688.81179: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853688.81207: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853688.81247: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853688.81297: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853688.81330: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853688.81421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853688.81461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853688.81492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853688.81543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853688.81564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853688.81628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 30583 1726853688.81650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853688.81720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853688.81733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853688.81758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853688.81999: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853688.82114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853688.82158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853688.82265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853688.82268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853688.82273: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853688.82354: variable 'ansible_python' from source: facts 30583 1726853688.82391: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853688.82488: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853688.82578: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853688.82730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853688.82805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853688.82808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853688.82842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853688.82866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853688.82927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853688.82973: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853688.83024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853688.83060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853688.83133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853688.83244: variable 'network_connections' from source: include params 30583 1726853688.83260: variable 'interface' from source: play vars 30583 1726853688.83335: variable 'interface' from source: play vars 30583 1726853688.83485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853688.83686: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853688.83787: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853688.83805: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853688.83853: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853688.84076: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853688.84079: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853688.84081: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853688.84083: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853688.84085: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853688.84361: variable 'network_connections' from source: include params 30583 1726853688.84377: variable 'interface' from source: play vars 30583 1726853688.84463: variable 'interface' from source: play vars 30583 1726853688.84519: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853688.84612: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853688.84928: variable 'network_connections' from source: include params 30583 1726853688.84937: variable 'interface' from source: play vars 30583 1726853688.85017: variable 'interface' from source: play vars 30583 1726853688.85046: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853688.85137: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853688.85464: variable 'network_connections' from source: include params 30583 1726853688.85507: variable 'interface' from source: play vars 30583 1726853688.85559: variable 'interface' from source: play vars 30583 1726853688.85632: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853688.85699: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 
1726853688.85722: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853688.85784: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853688.86049: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853688.86538: variable 'network_connections' from source: include params 30583 1726853688.86547: variable 'interface' from source: play vars 30583 1726853688.86620: variable 'interface' from source: play vars 30583 1726853688.86636: variable 'ansible_distribution' from source: facts 30583 1726853688.86677: variable '__network_rh_distros' from source: role '' defaults 30583 1726853688.86680: variable 'ansible_distribution_major_version' from source: facts 30583 1726853688.86692: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853688.86882: variable 'ansible_distribution' from source: facts 30583 1726853688.86890: variable '__network_rh_distros' from source: role '' defaults 30583 1726853688.86899: variable 'ansible_distribution_major_version' from source: facts 30583 1726853688.86920: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853688.87129: variable 'ansible_distribution' from source: facts 30583 1726853688.87133: variable '__network_rh_distros' from source: role '' defaults 30583 1726853688.87135: variable 'ansible_distribution_major_version' from source: facts 30583 1726853688.87166: variable 'network_provider' from source: set_fact 30583 1726853688.87198: variable 'omit' from source: magic vars 30583 1726853688.87275: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853688.87279: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853688.87295: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853688.87313: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853688.87325: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853688.87363: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853688.87370: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853688.87378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853688.87476: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853688.87487: Set connection var ansible_timeout to 10 30583 1726853688.87564: Set connection var ansible_connection to ssh 30583 1726853688.87567: Set connection var ansible_shell_executable to /bin/sh 30583 1726853688.87569: Set connection var ansible_shell_type to sh 30583 1726853688.87574: Set connection var ansible_pipelining to False 30583 1726853688.87576: variable 'ansible_shell_executable' from source: unknown 30583 1726853688.87578: variable 'ansible_connection' from source: unknown 30583 1726853688.87580: variable 'ansible_module_compression' from source: unknown 30583 1726853688.87582: variable 'ansible_shell_type' from source: unknown 30583 1726853688.87584: variable 'ansible_shell_executable' from source: unknown 30583 1726853688.87586: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853688.87588: variable 'ansible_pipelining' from source: unknown 30583 1726853688.87590: variable 'ansible_timeout' from source: unknown 30583 1726853688.87601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853688.87726: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853688.87778: variable 'omit' from source: magic vars 30583 1726853688.87781: starting attempt loop 30583 1726853688.87784: running the handler 30583 1726853688.87860: variable 'ansible_facts' from source: unknown 30583 1726853688.88678: _low_level_execute_command(): starting 30583 1726853688.88691: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853688.89493: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853688.89567: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853688.89607: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853688.89722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853688.91479: stdout chunk (state=3): >>>/root <<< 30583 1726853688.91644: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853688.91647: stdout chunk (state=3): >>><<< 30583 1726853688.91650: stderr chunk (state=3): >>><<< 30583 1726853688.91694: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853688.91796: _low_level_execute_command(): starting 30583 1726853688.91800: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853688.9170597-31687-66590606169203 `" && echo ansible-tmp-1726853688.9170597-31687-66590606169203="` echo /root/.ansible/tmp/ansible-tmp-1726853688.9170597-31687-66590606169203 `" ) && sleep 0' 30583 1726853688.92658: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853688.92678: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853688.92788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853688.92792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853688.92817: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853688.92834: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853688.92851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853688.92960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853688.95073: stdout chunk (state=3): >>>ansible-tmp-1726853688.9170597-31687-66590606169203=/root/.ansible/tmp/ansible-tmp-1726853688.9170597-31687-66590606169203 <<< 30583 1726853688.95234: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853688.95244: stdout chunk (state=3): >>><<< 30583 1726853688.95261: stderr chunk (state=3): >>><<< 30583 1726853688.95286: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853688.9170597-31687-66590606169203=/root/.ansible/tmp/ansible-tmp-1726853688.9170597-31687-66590606169203 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853688.95476: variable 'ansible_module_compression' from source: unknown 30583 1726853688.95480: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30583 1726853688.95482: variable 'ansible_facts' from source: unknown 30583 1726853688.95666: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853688.9170597-31687-66590606169203/AnsiballZ_systemd.py 30583 1726853688.95911: Sending initial data 30583 1726853688.95920: Sent initial data (155 bytes) 30583 1726853688.96592: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853688.96668: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853688.96688: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853688.96712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853688.96820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853688.98523: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853688.98611: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853688.98689: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpznbyozcv /root/.ansible/tmp/ansible-tmp-1726853688.9170597-31687-66590606169203/AnsiballZ_systemd.py <<< 30583 1726853688.98693: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853688.9170597-31687-66590606169203/AnsiballZ_systemd.py" <<< 30583 1726853688.98774: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpznbyozcv" to remote "/root/.ansible/tmp/ansible-tmp-1726853688.9170597-31687-66590606169203/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853688.9170597-31687-66590606169203/AnsiballZ_systemd.py" <<< 30583 1726853689.00494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853689.00634: stderr chunk (state=3): >>><<< 30583 1726853689.00637: stdout chunk (state=3): >>><<< 30583 1726853689.00640: done transferring module to remote 30583 1726853689.00642: _low_level_execute_command(): starting 30583 1726853689.00644: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853688.9170597-31687-66590606169203/ /root/.ansible/tmp/ansible-tmp-1726853688.9170597-31687-66590606169203/AnsiballZ_systemd.py && sleep 0' 30583 1726853689.01265: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853689.01317: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853689.01342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853689.01356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853689.01440: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853689.01466: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853689.01484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853689.01586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853689.03553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853689.03557: stdout chunk (state=3): >>><<< 30583 1726853689.03559: stderr chunk (state=3): >>><<< 30583 1726853689.03589: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853689.03667: _low_level_execute_command(): starting 30583 1726853689.03683: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853688.9170597-31687-66590606169203/AnsiballZ_systemd.py && sleep 0' 30583 1726853689.04275: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853689.04353: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 
1726853689.04381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853689.04425: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853689.04522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853690.35029: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; 
start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4595712", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3303522304", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1789368000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", 
"StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredum<<< 30583 1726853690.35300: stdout chunk (state=3): >>>pReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", 
"LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", 
"RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30583 1726853690.38340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853690.38344: stdout chunk (state=3): >>><<< 30583 1726853690.38577: stderr chunk (state=3): >>><<< 30583 1726853690.38583: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4595712", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3303522304", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1789368000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", 
"ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
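The module result above records an invocation of ansible.legacy.systemd with module_args name=NetworkManager, state=started, enabled=true (and, per the surrounding records, '_ansible_no_log': True). A hedged sketch of the task that would produce this invocation; the task name matches the TASK banner in this log, but the role's actual YAML is not shown here and may differ:

```yaml
# Reconstructed from the logged module_args; illustration only, not the
# role's actual source (fedora.linux_system_roles.network).
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true  # matches the "censored" result in the log
```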
30583 1726853690.38594: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853688.9170597-31687-66590606169203/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853690.38597: _low_level_execute_command(): starting 30583 1726853690.38599: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853688.9170597-31687-66590606169203/ > /dev/null 2>&1 && sleep 0' 30583 1726853690.39258: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853690.39296: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853690.39307: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853690.39328: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853690.39436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853690.42043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853690.42047: stdout chunk (state=3): >>><<< 30583 1726853690.42049: stderr chunk (state=3): >>><<< 30583 1726853690.42067: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853690.42081: handler run complete 30583 
1726853690.42260: attempt loop complete, returning result 30583 1726853690.42263: _execute() done 30583 1726853690.42265: dumping result to json 30583 1726853690.42267: done dumping result, returning 30583 1726853690.42269: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-05ea-abc5-00000000073b] 30583 1726853690.42273: sending task result for task 02083763-bbaf-05ea-abc5-00000000073b 30583 1726853690.43063: done sending task result for task 02083763-bbaf-05ea-abc5-00000000073b 30583 1726853690.43066: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853690.43114: no more pending results, returning what we have 30583 1726853690.43117: results queue empty 30583 1726853690.43118: checking for any_errors_fatal 30583 1726853690.43122: done checking for any_errors_fatal 30583 1726853690.43123: checking for max_fail_percentage 30583 1726853690.43124: done checking for max_fail_percentage 30583 1726853690.43125: checking to see if all hosts have failed and the running result is not ok 30583 1726853690.43126: done checking to see if all hosts have failed 30583 1726853690.43127: getting the remaining hosts for this loop 30583 1726853690.43128: done getting the remaining hosts for this loop 30583 1726853690.43131: getting the next task for host managed_node2 30583 1726853690.43137: done getting next task for host managed_node2 30583 1726853690.43140: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30583 1726853690.43145: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853690.43154: getting variables 30583 1726853690.43158: in VariableManager get_vars() 30583 1726853690.43185: Calling all_inventory to load vars for managed_node2 30583 1726853690.43188: Calling groups_inventory to load vars for managed_node2 30583 1726853690.43190: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853690.43199: Calling all_plugins_play to load vars for managed_node2 30583 1726853690.43202: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853690.43205: Calling groups_plugins_play to load vars for managed_node2 30583 1726853690.44475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853690.46076: done with get_vars() 30583 1726853690.46102: done getting variables 30583 1726853690.46157: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:34:50 -0400 (0:00:01.701) 0:00:25.799 ****** 30583 1726853690.46196: entering _queue_task() for managed_node2/service 30583 1726853690.46524: worker is 1 (out of 1 available) 30583 1726853690.46536: exiting _queue_task() for managed_node2/service 30583 1726853690.46548: done queuing things up, now waiting for results queue to drain 30583 1726853690.46549: waiting for pending results... 30583 1726853690.46990: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30583 1726853690.47000: in run() - task 02083763-bbaf-05ea-abc5-00000000073c 30583 1726853690.47020: variable 'ansible_search_path' from source: unknown 30583 1726853690.47027: variable 'ansible_search_path' from source: unknown 30583 1726853690.47069: calling self._execute() 30583 1726853690.47168: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853690.47183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853690.47201: variable 'omit' from source: magic vars 30583 1726853690.47585: variable 'ansible_distribution_major_version' from source: facts 30583 1726853690.47602: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853690.47726: variable 'network_provider' from source: set_fact 30583 1726853690.47776: Evaluated conditional (network_provider == "nm"): True 30583 1726853690.47842: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853690.47936: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 30583 1726853690.48119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853690.50246: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853690.50349: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853690.50379: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853690.50418: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853690.50478: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853690.50737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853690.50780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853690.50876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853690.50879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853690.50882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853690.50930: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853690.50961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853690.50998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853690.51043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853690.51066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853690.51117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853690.51145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853690.51178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853690.51224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 
1726853690.51244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853690.51400: variable 'network_connections' from source: include params 30583 1726853690.51417: variable 'interface' from source: play vars 30583 1726853690.51499: variable 'interface' from source: play vars 30583 1726853690.51647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853690.51760: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853690.51806: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853690.51842: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853690.51885: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853690.51931: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853690.51961: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853690.52077: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853690.52079: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853690.52081: variable 
'__network_wireless_connections_defined' from source: role '' defaults 30583 1726853690.52327: variable 'network_connections' from source: include params 30583 1726853690.52337: variable 'interface' from source: play vars 30583 1726853690.52405: variable 'interface' from source: play vars 30583 1726853690.52458: Evaluated conditional (__network_wpa_supplicant_required): False 30583 1726853690.52467: when evaluation is False, skipping this task 30583 1726853690.52483: _execute() done 30583 1726853690.52491: dumping result to json 30583 1726853690.52499: done dumping result, returning 30583 1726853690.52512: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-05ea-abc5-00000000073c] 30583 1726853690.52536: sending task result for task 02083763-bbaf-05ea-abc5-00000000073c skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30583 1726853690.52711: no more pending results, returning what we have 30583 1726853690.52715: results queue empty 30583 1726853690.52716: checking for any_errors_fatal 30583 1726853690.52741: done checking for any_errors_fatal 30583 1726853690.52742: checking for max_fail_percentage 30583 1726853690.52745: done checking for max_fail_percentage 30583 1726853690.52745: checking to see if all hosts have failed and the running result is not ok 30583 1726853690.52746: done checking to see if all hosts have failed 30583 1726853690.52747: getting the remaining hosts for this loop 30583 1726853690.52749: done getting the remaining hosts for this loop 30583 1726853690.52752: getting the next task for host managed_node2 30583 1726853690.52763: done getting next task for host managed_node2 30583 1726853690.52766: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30583 1726853690.52775: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853690.52784: done sending task result for task 02083763-bbaf-05ea-abc5-00000000073c 30583 1726853690.52787: WORKER PROCESS EXITING 30583 1726853690.52798: getting variables 30583 1726853690.52799: in VariableManager get_vars() 30583 1726853690.52846: Calling all_inventory to load vars for managed_node2 30583 1726853690.52849: Calling groups_inventory to load vars for managed_node2 30583 1726853690.52852: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853690.52863: Calling all_plugins_play to load vars for managed_node2 30583 1726853690.52866: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853690.52868: Calling groups_plugins_play to load vars for managed_node2 30583 1726853690.53724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853690.54728: done with get_vars() 30583 1726853690.54751: done getting variables 30583 1726853690.54815: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:34:50 -0400 (0:00:00.086) 0:00:25.885 ****** 30583 1726853690.54849: entering _queue_task() for managed_node2/service 30583 1726853690.55279: worker is 1 (out of 1 available) 30583 1726853690.55290: exiting _queue_task() for managed_node2/service 30583 1726853690.55302: done queuing things up, now waiting for results queue to drain 30583 1726853690.55303: waiting for pending results... 
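The skip records above ("Evaluated conditional (...): False ... when evaluation is False, skipping this task") are produced by `when:` clauses; a hedged sketch of that pattern, with the conditions quoted verbatim from this log's "Evaluated conditional" lines and the task body assumed:

```yaml
# Illustration only: the conditions below are quoted from the log; the
# surrounding task body is an assumption, not the role's actual source.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "nm"
    - __network_wpa_supplicant_required
```

When any listed condition evaluates false, the task is reported as `skipping:` with the failing condition in `false_condition`, exactly as seen in the records above.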
30583 1726853690.55510: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30583 1726853690.55618: in run() - task 02083763-bbaf-05ea-abc5-00000000073d 30583 1726853690.55630: variable 'ansible_search_path' from source: unknown 30583 1726853690.55634: variable 'ansible_search_path' from source: unknown 30583 1726853690.55665: calling self._execute() 30583 1726853690.55746: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853690.55749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853690.55761: variable 'omit' from source: magic vars 30583 1726853690.56033: variable 'ansible_distribution_major_version' from source: facts 30583 1726853690.56042: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853690.56128: variable 'network_provider' from source: set_fact 30583 1726853690.56132: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853690.56134: when evaluation is False, skipping this task 30583 1726853690.56137: _execute() done 30583 1726853690.56139: dumping result to json 30583 1726853690.56142: done dumping result, returning 30583 1726853690.56150: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-05ea-abc5-00000000073d] 30583 1726853690.56154: sending task result for task 02083763-bbaf-05ea-abc5-00000000073d 30583 1726853690.56243: done sending task result for task 02083763-bbaf-05ea-abc5-00000000073d 30583 1726853690.56245: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853690.56287: no more pending results, returning what we have 30583 1726853690.56291: results queue empty 30583 1726853690.56292: checking for any_errors_fatal 30583 1726853690.56300: done checking for 
any_errors_fatal 30583 1726853690.56301: checking for max_fail_percentage 30583 1726853690.56303: done checking for max_fail_percentage 30583 1726853690.56304: checking to see if all hosts have failed and the running result is not ok 30583 1726853690.56304: done checking to see if all hosts have failed 30583 1726853690.56305: getting the remaining hosts for this loop 30583 1726853690.56307: done getting the remaining hosts for this loop 30583 1726853690.56310: getting the next task for host managed_node2 30583 1726853690.56319: done getting next task for host managed_node2 30583 1726853690.56322: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853690.56327: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853690.56345: getting variables 30583 1726853690.56347: in VariableManager get_vars() 30583 1726853690.56380: Calling all_inventory to load vars for managed_node2 30583 1726853690.56382: Calling groups_inventory to load vars for managed_node2 30583 1726853690.56384: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853690.56393: Calling all_plugins_play to load vars for managed_node2 30583 1726853690.56395: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853690.56398: Calling groups_plugins_play to load vars for managed_node2 30583 1726853690.57452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853690.60290: done with get_vars() 30583 1726853690.60324: done getting variables 30583 1726853690.60405: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:34:50 -0400 (0:00:00.055) 0:00:25.941 ****** 30583 1726853690.60446: entering _queue_task() for managed_node2/copy 30583 1726853690.60817: worker is 1 (out of 1 available) 30583 1726853690.60831: exiting _queue_task() for managed_node2/copy 30583 1726853690.60844: done queuing things up, now waiting for results queue to drain 30583 1726853690.60845: waiting for pending results... 
30583 1726853690.61256: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853690.61325: in run() - task 02083763-bbaf-05ea-abc5-00000000073e 30583 1726853690.61345: variable 'ansible_search_path' from source: unknown 30583 1726853690.61378: variable 'ansible_search_path' from source: unknown 30583 1726853690.61403: calling self._execute() 30583 1726853690.61706: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853690.61710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853690.61712: variable 'omit' from source: magic vars 30583 1726853690.62577: variable 'ansible_distribution_major_version' from source: facts 30583 1726853690.62776: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853690.63186: variable 'network_provider' from source: set_fact 30583 1726853690.63190: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853690.63192: when evaluation is False, skipping this task 30583 1726853690.63195: _execute() done 30583 1726853690.63197: dumping result to json 30583 1726853690.63199: done dumping result, returning 30583 1726853690.63203: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-05ea-abc5-00000000073e] 30583 1726853690.63205: sending task result for task 02083763-bbaf-05ea-abc5-00000000073e 30583 1726853690.63284: done sending task result for task 02083763-bbaf-05ea-abc5-00000000073e 30583 1726853690.63287: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30583 1726853690.63468: no more pending results, returning what we have 30583 1726853690.63474: results queue empty 30583 1726853690.63476: checking for 
any_errors_fatal 30583 1726853690.63482: done checking for any_errors_fatal 30583 1726853690.63483: checking for max_fail_percentage 30583 1726853690.63486: done checking for max_fail_percentage 30583 1726853690.63487: checking to see if all hosts have failed and the running result is not ok 30583 1726853690.63488: done checking to see if all hosts have failed 30583 1726853690.63488: getting the remaining hosts for this loop 30583 1726853690.63491: done getting the remaining hosts for this loop 30583 1726853690.63495: getting the next task for host managed_node2 30583 1726853690.63504: done getting next task for host managed_node2 30583 1726853690.63508: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853690.63514: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853690.63535: getting variables 30583 1726853690.63538: in VariableManager get_vars() 30583 1726853690.63681: Calling all_inventory to load vars for managed_node2 30583 1726853690.63685: Calling groups_inventory to load vars for managed_node2 30583 1726853690.63688: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853690.63706: Calling all_plugins_play to load vars for managed_node2 30583 1726853690.63712: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853690.63716: Calling groups_plugins_play to load vars for managed_node2 30583 1726853690.65704: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853690.67232: done with get_vars() 30583 1726853690.67257: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:34:50 -0400 (0:00:00.068) 0:00:26.010 ****** 30583 1726853690.67348: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853690.67718: worker is 1 (out of 1 available) 30583 1726853690.67730: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853690.67743: done queuing things up, now waiting for results queue to drain 30583 1726853690.67745: waiting for pending results... 
30583 1726853690.68052: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853690.68191: in run() - task 02083763-bbaf-05ea-abc5-00000000073f 30583 1726853690.68219: variable 'ansible_search_path' from source: unknown 30583 1726853690.68228: variable 'ansible_search_path' from source: unknown 30583 1726853690.68269: calling self._execute() 30583 1726853690.68377: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853690.68389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853690.68427: variable 'omit' from source: magic vars 30583 1726853690.68796: variable 'ansible_distribution_major_version' from source: facts 30583 1726853690.68813: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853690.68832: variable 'omit' from source: magic vars 30583 1726853690.68976: variable 'omit' from source: magic vars 30583 1726853690.69070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853690.72286: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853690.72360: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853690.72622: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853690.72625: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853690.72628: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853690.72776: variable 'network_provider' from source: set_fact 30583 1726853690.73027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853690.73376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853690.73380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853690.73382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853690.73384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853690.73386: variable 'omit' from source: magic vars 30583 1726853690.73649: variable 'omit' from source: magic vars 30583 1726853690.73940: variable 'network_connections' from source: include params 30583 1726853690.73959: variable 'interface' from source: play vars 30583 1726853690.74024: variable 'interface' from source: play vars 30583 1726853690.74346: variable 'omit' from source: magic vars 30583 1726853690.74488: variable '__lsr_ansible_managed' from source: task vars 30583 1726853690.74549: variable '__lsr_ansible_managed' from source: task vars 30583 1726853690.74808: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30583 1726853690.75024: Loaded config def from plugin (lookup/template) 30583 1726853690.75036: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30583 1726853690.75067: File lookup term: get_ansible_managed.j2 30583 1726853690.75078: variable 
'ansible_search_path' from source: unknown 30583 1726853690.75088: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30583 1726853690.75106: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30583 1726853690.75134: variable 'ansible_search_path' from source: unknown 30583 1726853690.81240: variable 'ansible_managed' from source: unknown 30583 1726853690.81377: variable 'omit' from source: magic vars 30583 1726853690.81416: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853690.81448: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853690.81477: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853690.81499: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30583 1726853690.81518: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853690.81550: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853690.81558: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853690.81566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853690.81697: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853690.81709: Set connection var ansible_timeout to 10 30583 1726853690.81716: Set connection var ansible_connection to ssh 30583 1726853690.81730: Set connection var ansible_shell_executable to /bin/sh 30583 1726853690.81737: Set connection var ansible_shell_type to sh 30583 1726853690.81750: Set connection var ansible_pipelining to False 30583 1726853690.81781: variable 'ansible_shell_executable' from source: unknown 30583 1726853690.81789: variable 'ansible_connection' from source: unknown 30583 1726853690.81837: variable 'ansible_module_compression' from source: unknown 30583 1726853690.81840: variable 'ansible_shell_type' from source: unknown 30583 1726853690.81843: variable 'ansible_shell_executable' from source: unknown 30583 1726853690.81845: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853690.81847: variable 'ansible_pipelining' from source: unknown 30583 1726853690.81849: variable 'ansible_timeout' from source: unknown 30583 1726853690.81851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853690.81969: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853690.81995: variable 'omit' from 
source: magic vars 30583 1726853690.82055: starting attempt loop 30583 1726853690.82058: running the handler 30583 1726853690.82060: _low_level_execute_command(): starting 30583 1726853690.82062: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853690.82744: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853690.82760: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853690.82864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853690.82869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853690.82895: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853690.82910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853690.82932: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853690.83040: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853690.84753: stdout chunk (state=3): >>>/root <<< 30583 1726853690.84907: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 
1726853690.84919: stdout chunk (state=3): >>><<< 30583 1726853690.84936: stderr chunk (state=3): >>><<< 30583 1726853690.85062: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853690.85066: _low_level_execute_command(): starting 30583 1726853690.85070: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853690.849679-31766-154518877208347 `" && echo ansible-tmp-1726853690.849679-31766-154518877208347="` echo /root/.ansible/tmp/ansible-tmp-1726853690.849679-31766-154518877208347 `" ) && sleep 0' 30583 1726853690.85625: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853690.85639: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
30583 1726853690.85653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853690.85674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853690.85727: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853690.85788: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853690.85807: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853690.85834: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853690.85944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853690.87931: stdout chunk (state=3): >>>ansible-tmp-1726853690.849679-31766-154518877208347=/root/.ansible/tmp/ansible-tmp-1726853690.849679-31766-154518877208347 <<< 30583 1726853690.88079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853690.88083: stdout chunk (state=3): >>><<< 30583 1726853690.88086: stderr chunk (state=3): >>><<< 30583 1726853690.88177: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853690.849679-31766-154518877208347=/root/.ansible/tmp/ansible-tmp-1726853690.849679-31766-154518877208347 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853690.88184: variable 'ansible_module_compression' from source: unknown 30583 1726853690.88209: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30583 1726853690.88242: variable 'ansible_facts' from source: unknown 30583 1726853690.88364: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853690.849679-31766-154518877208347/AnsiballZ_network_connections.py 30583 1726853690.88620: Sending initial data 30583 1726853690.88623: Sent initial data (167 bytes) 30583 1726853690.89217: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853690.89292: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853690.89345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853690.89365: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853690.89688: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853690.89903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853690.91653: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: 
Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853690.91679: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853690.91741: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmprjaxdxom /root/.ansible/tmp/ansible-tmp-1726853690.849679-31766-154518877208347/AnsiballZ_network_connections.py <<< 30583 1726853690.91745: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853690.849679-31766-154518877208347/AnsiballZ_network_connections.py" <<< 30583 1726853690.91977: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmprjaxdxom" to remote "/root/.ansible/tmp/ansible-tmp-1726853690.849679-31766-154518877208347/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853690.849679-31766-154518877208347/AnsiballZ_network_connections.py" <<< 30583 1726853690.93967: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853690.94133: stderr chunk (state=3): >>><<< 30583 1726853690.94136: stdout chunk (state=3): >>><<< 30583 1726853690.94162: done transferring module to remote 30583 1726853690.94175: _low_level_execute_command(): starting 30583 1726853690.94183: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853690.849679-31766-154518877208347/ /root/.ansible/tmp/ansible-tmp-1726853690.849679-31766-154518877208347/AnsiballZ_network_connections.py && sleep 0' 30583 1726853690.95294: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853690.95297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853690.95314: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853690.95320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853690.95348: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853690.95354: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853690.95684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853690.95707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853690.95801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853690.97749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853690.97794: stderr chunk (state=3): >>><<< 30583 1726853690.97798: stdout chunk (state=3): >>><<< 30583 1726853690.97816: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853690.97819: _low_level_execute_command(): starting 30583 1726853690.97825: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853690.849679-31766-154518877208347/AnsiballZ_network_connections.py && sleep 0' 30583 1726853690.98978: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853690.98997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853690.99076: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853690.99079: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853690.99083: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853690.99195: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853690.99202: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853690.99219: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853690.99328: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853691.25954: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 082d2e42-0ca8-4d06-a689-24a49f64d485\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30583 1726853691.27934: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853691.28278: stderr chunk (state=3): >>><<< 30583 1726853691.28282: stdout chunk (state=3): >>><<< 30583 1726853691.28285: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 082d2e42-0ca8-4d06-a689-24a49f64d485\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853691.28289: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'autoconnect': False, 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853690.849679-31766-154518877208347/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853691.28291: _low_level_execute_command(): starting 30583 1726853691.28293: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853690.849679-31766-154518877208347/ > /dev/null 2>&1 && sleep 0' 30583 1726853691.29091: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853691.29162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853691.29262: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853691.31173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853691.31195: stderr chunk (state=3): >>><<< 30583 1726853691.31201: stdout chunk (state=3): >>><<< 30583 1726853691.31217: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853691.31224: handler run complete 30583 1726853691.31273: attempt loop complete, returning result 30583 1726853691.31276: _execute() done 30583 1726853691.31279: dumping result to json 30583 1726853691.31284: done dumping result, returning 30583 1726853691.31293: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-05ea-abc5-00000000073f] 30583 1726853691.31297: sending task result for task 02083763-bbaf-05ea-abc5-00000000073f 30583 1726853691.31402: done sending task result for task 02083763-bbaf-05ea-abc5-00000000073f 30583 1726853691.31405: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 082d2e42-0ca8-4d06-a689-24a49f64d485 30583 1726853691.31540: no more pending results, returning what we have 30583 1726853691.31544: results queue empty 30583 1726853691.31545: checking for any_errors_fatal 30583 1726853691.31554: done checking for any_errors_fatal 30583 1726853691.31557: checking for max_fail_percentage 30583 1726853691.31559: done checking for max_fail_percentage 30583 1726853691.31560: checking to see if all hosts have failed and the running result is not ok 30583 1726853691.31561: done checking to see if 
all hosts have failed 30583 1726853691.31561: getting the remaining hosts for this loop 30583 1726853691.31563: done getting the remaining hosts for this loop 30583 1726853691.31567: getting the next task for host managed_node2 30583 1726853691.31576: done getting next task for host managed_node2 30583 1726853691.31580: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853691.31584: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853691.31595: getting variables 30583 1726853691.31597: in VariableManager get_vars() 30583 1726853691.31777: Calling all_inventory to load vars for managed_node2 30583 1726853691.31780: Calling groups_inventory to load vars for managed_node2 30583 1726853691.31783: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853691.31792: Calling all_plugins_play to load vars for managed_node2 30583 1726853691.31795: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853691.31798: Calling groups_plugins_play to load vars for managed_node2 30583 1726853691.32899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853691.33876: done with get_vars() 30583 1726853691.33891: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:34:51 -0400 (0:00:00.666) 0:00:26.676 ****** 30583 1726853691.33957: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30583 1726853691.34207: worker is 1 (out of 1 available) 30583 1726853691.34221: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30583 1726853691.34236: done queuing things up, now waiting for results queue to drain 30583 1726853691.34237: waiting for pending results... 
30583 1726853691.34687: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853691.34692: in run() - task 02083763-bbaf-05ea-abc5-000000000740 30583 1726853691.34696: variable 'ansible_search_path' from source: unknown 30583 1726853691.34699: variable 'ansible_search_path' from source: unknown 30583 1726853691.34707: calling self._execute() 30583 1726853691.34768: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853691.34782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853691.34797: variable 'omit' from source: magic vars 30583 1726853691.35174: variable 'ansible_distribution_major_version' from source: facts 30583 1726853691.35192: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853691.35314: variable 'network_state' from source: role '' defaults 30583 1726853691.35331: Evaluated conditional (network_state != {}): False 30583 1726853691.35339: when evaluation is False, skipping this task 30583 1726853691.35346: _execute() done 30583 1726853691.35353: dumping result to json 30583 1726853691.35360: done dumping result, returning 30583 1726853691.35375: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-05ea-abc5-000000000740] 30583 1726853691.35386: sending task result for task 02083763-bbaf-05ea-abc5-000000000740 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853691.35539: no more pending results, returning what we have 30583 1726853691.35545: results queue empty 30583 1726853691.35546: checking for any_errors_fatal 30583 1726853691.35556: done checking for any_errors_fatal 30583 1726853691.35557: checking for max_fail_percentage 30583 1726853691.35559: done checking for max_fail_percentage 30583 1726853691.35560: 
checking to see if all hosts have failed and the running result is not ok 30583 1726853691.35561: done checking to see if all hosts have failed 30583 1726853691.35562: getting the remaining hosts for this loop 30583 1726853691.35563: done getting the remaining hosts for this loop 30583 1726853691.35567: getting the next task for host managed_node2 30583 1726853691.35576: done getting next task for host managed_node2 30583 1726853691.35580: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30583 1726853691.35584: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853691.35610: getting variables 30583 1726853691.35611: in VariableManager get_vars() 30583 1726853691.35646: Calling all_inventory to load vars for managed_node2 30583 1726853691.35649: Calling groups_inventory to load vars for managed_node2 30583 1726853691.35651: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853691.35666: Calling all_plugins_play to load vars for managed_node2 30583 1726853691.35669: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853691.35676: done sending task result for task 02083763-bbaf-05ea-abc5-000000000740 30583 1726853691.35679: WORKER PROCESS EXITING 30583 1726853691.35788: Calling groups_plugins_play to load vars for managed_node2 30583 1726853691.37152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853691.38887: done with get_vars() 30583 1726853691.38919: done getting variables 30583 1726853691.38994: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:34:51 -0400 (0:00:00.050) 0:00:26.727 ****** 30583 1726853691.39038: entering _queue_task() for managed_node2/debug 30583 1726853691.39421: worker is 1 (out of 1 available) 30583 1726853691.39435: exiting _queue_task() for managed_node2/debug 30583 1726853691.39447: done queuing things up, now waiting for results queue to drain 30583 1726853691.39448: waiting for pending results... 
30583 1726853691.39769: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30583 1726853691.40010: in run() - task 02083763-bbaf-05ea-abc5-000000000741 30583 1726853691.40015: variable 'ansible_search_path' from source: unknown 30583 1726853691.40017: variable 'ansible_search_path' from source: unknown 30583 1726853691.40020: calling self._execute() 30583 1726853691.40121: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853691.40134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853691.40147: variable 'omit' from source: magic vars 30583 1726853691.40563: variable 'ansible_distribution_major_version' from source: facts 30583 1726853691.40582: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853691.40593: variable 'omit' from source: magic vars 30583 1726853691.40677: variable 'omit' from source: magic vars 30583 1726853691.40716: variable 'omit' from source: magic vars 30583 1726853691.40762: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853691.40883: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853691.40886: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853691.40888: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853691.40891: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853691.40915: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853691.40923: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853691.40931: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 30583 1726853691.41048: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853691.41062: Set connection var ansible_timeout to 10 30583 1726853691.41069: Set connection var ansible_connection to ssh 30583 1726853691.41081: Set connection var ansible_shell_executable to /bin/sh 30583 1726853691.41088: Set connection var ansible_shell_type to sh 30583 1726853691.41114: Set connection var ansible_pipelining to False 30583 1726853691.41144: variable 'ansible_shell_executable' from source: unknown 30583 1726853691.41151: variable 'ansible_connection' from source: unknown 30583 1726853691.41207: variable 'ansible_module_compression' from source: unknown 30583 1726853691.41210: variable 'ansible_shell_type' from source: unknown 30583 1726853691.41212: variable 'ansible_shell_executable' from source: unknown 30583 1726853691.41214: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853691.41221: variable 'ansible_pipelining' from source: unknown 30583 1726853691.41223: variable 'ansible_timeout' from source: unknown 30583 1726853691.41225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853691.41369: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853691.41388: variable 'omit' from source: magic vars 30583 1726853691.41398: starting attempt loop 30583 1726853691.41405: running the handler 30583 1726853691.41644: variable '__network_connections_result' from source: set_fact 30583 1726853691.41647: handler run complete 30583 1726853691.41651: attempt loop complete, returning result 30583 1726853691.41653: _execute() done 30583 1726853691.41658: dumping result to json 30583 1726853691.41667: 
done dumping result, returning 30583 1726853691.41683: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-05ea-abc5-000000000741] 30583 1726853691.41692: sending task result for task 02083763-bbaf-05ea-abc5-000000000741 30583 1726853691.42021: done sending task result for task 02083763-bbaf-05ea-abc5-000000000741 30583 1726853691.42025: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 082d2e42-0ca8-4d06-a689-24a49f64d485" ] } 30583 1726853691.42101: no more pending results, returning what we have 30583 1726853691.42105: results queue empty 30583 1726853691.42106: checking for any_errors_fatal 30583 1726853691.42112: done checking for any_errors_fatal 30583 1726853691.42113: checking for max_fail_percentage 30583 1726853691.42114: done checking for max_fail_percentage 30583 1726853691.42115: checking to see if all hosts have failed and the running result is not ok 30583 1726853691.42116: done checking to see if all hosts have failed 30583 1726853691.42117: getting the remaining hosts for this loop 30583 1726853691.42119: done getting the remaining hosts for this loop 30583 1726853691.42122: getting the next task for host managed_node2 30583 1726853691.42129: done getting next task for host managed_node2 30583 1726853691.42139: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30583 1726853691.42144: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853691.42160: getting variables 30583 1726853691.42162: in VariableManager get_vars() 30583 1726853691.42200: Calling all_inventory to load vars for managed_node2 30583 1726853691.42204: Calling groups_inventory to load vars for managed_node2 30583 1726853691.42206: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853691.42216: Calling all_plugins_play to load vars for managed_node2 30583 1726853691.42219: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853691.42222: Calling groups_plugins_play to load vars for managed_node2 30583 1726853691.43961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853691.45688: done with get_vars() 30583 1726853691.45723: done getting variables 30583 1726853691.45792: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:34:51 -0400 (0:00:00.068) 0:00:26.795 ****** 30583 1726853691.45844: entering _queue_task() for managed_node2/debug 30583 1726853691.46220: worker is 1 (out of 1 available) 30583 1726853691.46234: exiting _queue_task() for managed_node2/debug 30583 1726853691.46246: done queuing things up, now waiting for results queue to drain 30583 1726853691.46247: waiting for pending results... 30583 1726853691.46597: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30583 1726853691.46726: in run() - task 02083763-bbaf-05ea-abc5-000000000742 30583 1726853691.46746: variable 'ansible_search_path' from source: unknown 30583 1726853691.46754: variable 'ansible_search_path' from source: unknown 30583 1726853691.46807: calling self._execute() 30583 1726853691.46977: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853691.46981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853691.46984: variable 'omit' from source: magic vars 30583 1726853691.47350: variable 'ansible_distribution_major_version' from source: facts 30583 1726853691.47370: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853691.47384: variable 'omit' from source: magic vars 30583 1726853691.47462: variable 'omit' from source: magic vars 30583 1726853691.47504: variable 'omit' from source: magic vars 30583 1726853691.47553: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853691.47767: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853691.47770: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853691.47780: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853691.47784: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853691.47786: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853691.47789: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853691.47791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853691.47836: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853691.47848: Set connection var ansible_timeout to 10 30583 1726853691.47858: Set connection var ansible_connection to ssh 30583 1726853691.47870: Set connection var ansible_shell_executable to /bin/sh 30583 1726853691.47977: Set connection var ansible_shell_type to sh 30583 1726853691.47982: Set connection var ansible_pipelining to False 30583 1726853691.47984: variable 'ansible_shell_executable' from source: unknown 30583 1726853691.47986: variable 'ansible_connection' from source: unknown 30583 1726853691.47991: variable 'ansible_module_compression' from source: unknown 30583 1726853691.47993: variable 'ansible_shell_type' from source: unknown 30583 1726853691.47995: variable 'ansible_shell_executable' from source: unknown 30583 1726853691.47997: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853691.47999: variable 'ansible_pipelining' from source: unknown 30583 1726853691.48000: variable 'ansible_timeout' from source: unknown 30583 1726853691.48002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853691.48177: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853691.48181: variable 'omit' from source: magic vars 30583 1726853691.48183: starting attempt loop 30583 1726853691.48185: running the handler 30583 1726853691.48276: variable '__network_connections_result' from source: set_fact 30583 1726853691.48320: variable '__network_connections_result' from source: set_fact 30583 1726853691.48462: handler run complete 30583 1726853691.48499: attempt loop complete, returning result 30583 1726853691.48506: _execute() done 30583 1726853691.48512: dumping result to json 30583 1726853691.48520: done dumping result, returning 30583 1726853691.48533: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-05ea-abc5-000000000742] 30583 1726853691.48541: sending task result for task 02083763-bbaf-05ea-abc5-000000000742 30583 1726853691.48789: done sending task result for task 02083763-bbaf-05ea-abc5-000000000742 30583 1726853691.48792: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 082d2e42-0ca8-4d06-a689-24a49f64d485\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 082d2e42-0ca8-4d06-a689-24a49f64d485" ] } } 30583 1726853691.48983: no more pending results, returning what we have 30583 
1726853691.48987: results queue empty 30583 1726853691.48988: checking for any_errors_fatal 30583 1726853691.49000: done checking for any_errors_fatal 30583 1726853691.49001: checking for max_fail_percentage 30583 1726853691.49003: done checking for max_fail_percentage 30583 1726853691.49004: checking to see if all hosts have failed and the running result is not ok 30583 1726853691.49005: done checking to see if all hosts have failed 30583 1726853691.49006: getting the remaining hosts for this loop 30583 1726853691.49007: done getting the remaining hosts for this loop 30583 1726853691.49011: getting the next task for host managed_node2 30583 1726853691.49019: done getting next task for host managed_node2 30583 1726853691.49022: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30583 1726853691.49027: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853691.49040: getting variables 30583 1726853691.49049: in VariableManager get_vars() 30583 1726853691.49191: Calling all_inventory to load vars for managed_node2 30583 1726853691.49194: Calling groups_inventory to load vars for managed_node2 30583 1726853691.49196: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853691.49206: Calling all_plugins_play to load vars for managed_node2 30583 1726853691.49209: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853691.49219: Calling groups_plugins_play to load vars for managed_node2 30583 1726853691.50705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853691.52412: done with get_vars() 30583 1726853691.52441: done getting variables 30583 1726853691.52514: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:34:51 -0400 (0:00:00.067) 0:00:26.862 ****** 30583 1726853691.52550: entering _queue_task() for managed_node2/debug 30583 1726853691.52996: worker is 1 (out of 1 available) 30583 1726853691.53010: exiting _queue_task() for managed_node2/debug 30583 1726853691.53080: done queuing things up, now waiting for results queue to drain 30583 1726853691.53082: waiting for pending results... 
30583 1726853691.53296: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30583 1726853691.53526: in run() - task 02083763-bbaf-05ea-abc5-000000000743 30583 1726853691.53531: variable 'ansible_search_path' from source: unknown 30583 1726853691.53534: variable 'ansible_search_path' from source: unknown 30583 1726853691.53548: calling self._execute() 30583 1726853691.53659: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853691.53741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853691.53745: variable 'omit' from source: magic vars 30583 1726853691.54115: variable 'ansible_distribution_major_version' from source: facts 30583 1726853691.54133: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853691.54276: variable 'network_state' from source: role '' defaults 30583 1726853691.54296: Evaluated conditional (network_state != {}): False 30583 1726853691.54304: when evaluation is False, skipping this task 30583 1726853691.54311: _execute() done 30583 1726853691.54318: dumping result to json 30583 1726853691.54334: done dumping result, returning 30583 1726853691.54377: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-05ea-abc5-000000000743] 30583 1726853691.54381: sending task result for task 02083763-bbaf-05ea-abc5-000000000743 skipping: [managed_node2] => { "false_condition": "network_state != {}" } 30583 1726853691.54515: no more pending results, returning what we have 30583 1726853691.54520: results queue empty 30583 1726853691.54521: checking for any_errors_fatal 30583 1726853691.54530: done checking for any_errors_fatal 30583 1726853691.54531: checking for max_fail_percentage 30583 1726853691.54533: done checking for max_fail_percentage 30583 1726853691.54534: checking to see if all hosts have 
failed and the running result is not ok 30583 1726853691.54535: done checking to see if all hosts have failed 30583 1726853691.54536: getting the remaining hosts for this loop 30583 1726853691.54538: done getting the remaining hosts for this loop 30583 1726853691.54542: getting the next task for host managed_node2 30583 1726853691.54550: done getting next task for host managed_node2 30583 1726853691.54558: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30583 1726853691.54563: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853691.54792: getting variables 30583 1726853691.54795: in VariableManager get_vars() 30583 1726853691.54832: Calling all_inventory to load vars for managed_node2 30583 1726853691.54835: Calling groups_inventory to load vars for managed_node2 30583 1726853691.54838: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853691.54850: Calling all_plugins_play to load vars for managed_node2 30583 1726853691.54853: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853691.54859: Calling groups_plugins_play to load vars for managed_node2 30583 1726853691.55396: done sending task result for task 02083763-bbaf-05ea-abc5-000000000743 30583 1726853691.55400: WORKER PROCESS EXITING 30583 1726853691.61348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853691.62994: done with get_vars() 30583 1726853691.63024: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:34:51 -0400 (0:00:00.105) 0:00:26.968 ****** 30583 1726853691.63120: entering _queue_task() for managed_node2/ping 30583 1726853691.63499: worker is 1 (out of 1 available) 30583 1726853691.63514: exiting _queue_task() for managed_node2/ping 30583 1726853691.63526: done queuing things up, now waiting for results queue to drain 30583 1726853691.63528: waiting for pending results... 
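The "Re-test connectivity" task queued above dispatches the ping module (the log shows `_queue_task() for managed_node2/ping` and the 'normal' action plugin being loaded). Its task definition is presumably as simple as the following sketch; ping takes no required arguments and merely verifies that a usable Python is reachable on the target:

```yaml
- name: Re-test connectivity
  ping:   # round-trips a module over the existing SSH mux connection
```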
30583 1726853691.63748: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 30583 1726853691.63895: in run() - task 02083763-bbaf-05ea-abc5-000000000744 30583 1726853691.63907: variable 'ansible_search_path' from source: unknown 30583 1726853691.63912: variable 'ansible_search_path' from source: unknown 30583 1726853691.63950: calling self._execute() 30583 1726853691.64048: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853691.64057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853691.64063: variable 'omit' from source: magic vars 30583 1726853691.64431: variable 'ansible_distribution_major_version' from source: facts 30583 1726853691.64442: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853691.64448: variable 'omit' from source: magic vars 30583 1726853691.64510: variable 'omit' from source: magic vars 30583 1726853691.64544: variable 'omit' from source: magic vars 30583 1726853691.64673: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853691.64678: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853691.64680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853691.64684: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853691.64686: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853691.64700: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853691.64703: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853691.64706: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30583 1726853691.64804: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853691.64811: Set connection var ansible_timeout to 10 30583 1726853691.64814: Set connection var ansible_connection to ssh 30583 1726853691.64820: Set connection var ansible_shell_executable to /bin/sh 30583 1726853691.64823: Set connection var ansible_shell_type to sh 30583 1726853691.64832: Set connection var ansible_pipelining to False 30583 1726853691.64853: variable 'ansible_shell_executable' from source: unknown 30583 1726853691.64858: variable 'ansible_connection' from source: unknown 30583 1726853691.64861: variable 'ansible_module_compression' from source: unknown 30583 1726853691.64866: variable 'ansible_shell_type' from source: unknown 30583 1726853691.64868: variable 'ansible_shell_executable' from source: unknown 30583 1726853691.64872: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853691.64875: variable 'ansible_pipelining' from source: unknown 30583 1726853691.64877: variable 'ansible_timeout' from source: unknown 30583 1726853691.64879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853691.65031: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853691.65041: variable 'omit' from source: magic vars 30583 1726853691.65045: starting attempt loop 30583 1726853691.65048: running the handler 30583 1726853691.65063: _low_level_execute_command(): starting 30583 1726853691.65066: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853691.65544: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 
1726853691.65580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853691.65584: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853691.65586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853691.65589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853691.65633: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853691.65636: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853691.65726: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853691.67488: stdout chunk (state=3): >>>/root <<< 30583 1726853691.67593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853691.67623: stderr chunk (state=3): >>><<< 30583 1726853691.67626: stdout chunk (state=3): >>><<< 30583 1726853691.67641: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 
10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853691.67674: _low_level_execute_command(): starting 30583 1726853691.67677: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853691.676466-31802-9269622525601 `" && echo ansible-tmp-1726853691.676466-31802-9269622525601="` echo /root/.ansible/tmp/ansible-tmp-1726853691.676466-31802-9269622525601 `" ) && sleep 0' 30583 1726853691.68080: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853691.68083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853691.68095: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853691.68097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853691.68142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853691.68145: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853691.68224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853691.70264: stdout chunk (state=3): >>>ansible-tmp-1726853691.676466-31802-9269622525601=/root/.ansible/tmp/ansible-tmp-1726853691.676466-31802-9269622525601 <<< 30583 1726853691.70368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853691.70393: stderr chunk (state=3): >>><<< 30583 1726853691.70397: stdout chunk (state=3): >>><<< 30583 1726853691.70412: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853691.676466-31802-9269622525601=/root/.ansible/tmp/ansible-tmp-1726853691.676466-31802-9269622525601 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853691.70453: variable 'ansible_module_compression' from source: unknown 30583 1726853691.70491: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30583 1726853691.70522: variable 'ansible_facts' from source: unknown 30583 1726853691.70583: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853691.676466-31802-9269622525601/AnsiballZ_ping.py 30583 1726853691.70686: Sending initial data 30583 1726853691.70689: Sent initial data (150 bytes) 30583 1726853691.71186: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853691.71231: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853691.71247: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853691.71268: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853691.71377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853691.73043: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30583 1726853691.73050: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853691.73113: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853691.73182: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp8j59n7cp /root/.ansible/tmp/ansible-tmp-1726853691.676466-31802-9269622525601/AnsiballZ_ping.py <<< 30583 1726853691.73189: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853691.676466-31802-9269622525601/AnsiballZ_ping.py" <<< 30583 1726853691.73253: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp8j59n7cp" to remote "/root/.ansible/tmp/ansible-tmp-1726853691.676466-31802-9269622525601/AnsiballZ_ping.py" <<< 30583 1726853691.73259: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853691.676466-31802-9269622525601/AnsiballZ_ping.py" <<< 30583 1726853691.74074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853691.74078: stderr chunk (state=3): >>><<< 30583 1726853691.74080: stdout chunk (state=3): >>><<< 30583 1726853691.74082: done transferring module to remote 30583 1726853691.74084: _low_level_execute_command(): starting 30583 1726853691.74086: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853691.676466-31802-9269622525601/ /root/.ansible/tmp/ansible-tmp-1726853691.676466-31802-9269622525601/AnsiballZ_ping.py && sleep 0' 30583 1726853691.74623: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853691.74637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853691.74659: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853691.74663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853691.74733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853691.74737: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853691.74788: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853691.74791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853691.74866: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853691.76738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853691.76761: stderr chunk (state=3): >>><<< 30583 1726853691.76765: stdout chunk (state=3): >>><<< 30583 1726853691.76778: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853691.76782: _low_level_execute_command(): starting 30583 1726853691.76786: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853691.676466-31802-9269622525601/AnsiballZ_ping.py && sleep 0' 30583 1726853691.77205: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853691.77208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853691.77210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853691.77213: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853691.77215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853691.77266: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853691.77272: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853691.77348: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853691.92829: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30583 1726853691.94246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853691.94282: stderr chunk (state=3): >>><<< 30583 1726853691.94286: stdout chunk (state=3): >>><<< 30583 1726853691.94298: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853691.94320: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853691.676466-31802-9269622525601/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853691.94341: _low_level_execute_command(): starting 30583 1726853691.94345: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853691.676466-31802-9269622525601/ > /dev/null 2>&1 && sleep 0' 30583 1726853691.94879: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853691.94883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853691.94886: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853691.94898: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853691.94953: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853691.94957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853691.94959: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853691.95039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853691.97053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853691.97128: stderr chunk (state=3): >>><<< 30583 1726853691.97132: stdout chunk (state=3): >>><<< 30583 1726853691.97194: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853691.97197: handler run complete 30583 1726853691.97200: attempt loop complete, returning result 30583 1726853691.97203: _execute() done 30583 1726853691.97205: dumping result to json 30583 1726853691.97209: done dumping result, returning 30583 1726853691.97211: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-05ea-abc5-000000000744] 30583 1726853691.97213: sending task result for task 02083763-bbaf-05ea-abc5-000000000744 30583 1726853691.97320: done sending task result for task 02083763-bbaf-05ea-abc5-000000000744 30583 1726853691.97324: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 30583 1726853691.97397: no more pending results, returning what we have 30583 1726853691.97401: results queue empty 30583 1726853691.97402: checking for any_errors_fatal 30583 1726853691.97410: done checking for any_errors_fatal 30583 1726853691.97411: checking for max_fail_percentage 30583 1726853691.97413: done checking for max_fail_percentage 30583 1726853691.97413: checking to see if all hosts have failed and the running result is not ok 30583 1726853691.97414: done checking to see if all hosts have failed 30583 1726853691.97414: getting the remaining hosts for this loop 30583 1726853691.97416: done getting the remaining hosts for this loop 30583 1726853691.97420: getting the next task for host managed_node2 30583 1726853691.97430: done getting next task for host managed_node2 30583 1726853691.97433: ^ task is: TASK: meta (role_complete) 30583 1726853691.97438: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853691.97450: getting variables 30583 1726853691.97451: in VariableManager get_vars() 30583 1726853691.97531: Calling all_inventory to load vars for managed_node2 30583 1726853691.97534: Calling groups_inventory to load vars for managed_node2 30583 1726853691.97536: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853691.97545: Calling all_plugins_play to load vars for managed_node2 30583 1726853691.97548: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853691.97550: Calling groups_plugins_play to load vars for managed_node2 30583 1726853691.98831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853692.01102: done with get_vars() 30583 1726853692.01127: done getting variables 30583 1726853692.01228: done queuing things up, now waiting for results queue to drain 30583 1726853692.01230: results queue empty 30583 1726853692.01231: checking for any_errors_fatal 30583 1726853692.01234: done checking for 
any_errors_fatal 30583 1726853692.01234: checking for max_fail_percentage 30583 1726853692.01236: done checking for max_fail_percentage 30583 1726853692.01236: checking to see if all hosts have failed and the running result is not ok 30583 1726853692.01237: done checking to see if all hosts have failed 30583 1726853692.01238: getting the remaining hosts for this loop 30583 1726853692.01239: done getting the remaining hosts for this loop 30583 1726853692.01241: getting the next task for host managed_node2 30583 1726853692.01246: done getting next task for host managed_node2 30583 1726853692.01249: ^ task is: TASK: Show result 30583 1726853692.01252: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853692.01254: getting variables 30583 1726853692.01258: in VariableManager get_vars() 30583 1726853692.01312: Calling all_inventory to load vars for managed_node2 30583 1726853692.01314: Calling groups_inventory to load vars for managed_node2 30583 1726853692.01317: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853692.01373: Calling all_plugins_play to load vars for managed_node2 30583 1726853692.01377: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853692.01381: Calling groups_plugins_play to load vars for managed_node2 30583 1726853692.03553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853692.05649: done with get_vars() 30583 1726853692.05698: done getting variables 30583 1726853692.05742: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml:15 Friday 20 September 2024 13:34:52 -0400 (0:00:00.426) 0:00:27.395 ****** 30583 1726853692.05791: entering _queue_task() for managed_node2/debug 30583 1726853692.06450: worker is 1 (out of 1 available) 30583 1726853692.06464: exiting _queue_task() for managed_node2/debug 30583 1726853692.06478: done queuing things up, now waiting for results queue to drain 30583 1726853692.06479: waiting for pending results... 
30583 1726853692.06888: running TaskExecutor() for managed_node2/TASK: Show result 30583 1726853692.06912: in run() - task 02083763-bbaf-05ea-abc5-0000000006b2 30583 1726853692.06939: variable 'ansible_search_path' from source: unknown 30583 1726853692.06949: variable 'ansible_search_path' from source: unknown 30583 1726853692.06996: calling self._execute() 30583 1726853692.07107: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853692.07123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853692.07276: variable 'omit' from source: magic vars 30583 1726853692.07547: variable 'ansible_distribution_major_version' from source: facts 30583 1726853692.07568: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853692.07582: variable 'omit' from source: magic vars 30583 1726853692.07633: variable 'omit' from source: magic vars 30583 1726853692.07681: variable 'omit' from source: magic vars 30583 1726853692.07724: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853692.07766: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853692.07792: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853692.07815: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853692.07834: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853692.07874: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853692.07888: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853692.07902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853692.08008: Set 
connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853692.08026: Set connection var ansible_timeout to 10 30583 1726853692.08034: Set connection var ansible_connection to ssh 30583 1726853692.08046: Set connection var ansible_shell_executable to /bin/sh 30583 1726853692.08057: Set connection var ansible_shell_type to sh 30583 1726853692.08078: Set connection var ansible_pipelining to False 30583 1726853692.08112: variable 'ansible_shell_executable' from source: unknown 30583 1726853692.08121: variable 'ansible_connection' from source: unknown 30583 1726853692.08275: variable 'ansible_module_compression' from source: unknown 30583 1726853692.08278: variable 'ansible_shell_type' from source: unknown 30583 1726853692.08281: variable 'ansible_shell_executable' from source: unknown 30583 1726853692.08283: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853692.08285: variable 'ansible_pipelining' from source: unknown 30583 1726853692.08287: variable 'ansible_timeout' from source: unknown 30583 1726853692.08289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853692.08323: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853692.08346: variable 'omit' from source: magic vars 30583 1726853692.08360: starting attempt loop 30583 1726853692.08366: running the handler 30583 1726853692.08422: variable '__network_connections_result' from source: set_fact 30583 1726853692.08512: variable '__network_connections_result' from source: set_fact 30583 1726853692.08642: handler run complete 30583 1726853692.08683: attempt loop complete, returning result 30583 1726853692.08690: _execute() done 30583 1726853692.08697: dumping result to json 30583 
1726853692.08706: done dumping result, returning 30583 1726853692.08718: done running TaskExecutor() for managed_node2/TASK: Show result [02083763-bbaf-05ea-abc5-0000000006b2] 30583 1726853692.08727: sending task result for task 02083763-bbaf-05ea-abc5-0000000006b2 ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 082d2e42-0ca8-4d06-a689-24a49f64d485\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 082d2e42-0ca8-4d06-a689-24a49f64d485" ] } } 30583 1726853692.09058: no more pending results, returning what we have 30583 1726853692.09062: results queue empty 30583 1726853692.09063: checking for any_errors_fatal 30583 1726853692.09064: done checking for any_errors_fatal 30583 1726853692.09065: checking for max_fail_percentage 30583 1726853692.09067: done checking for max_fail_percentage 30583 1726853692.09067: checking to see if all hosts have failed and the running result is not ok 30583 1726853692.09068: done checking to see if all hosts have failed 30583 1726853692.09069: getting the remaining hosts for this loop 30583 1726853692.09072: done getting the remaining hosts for this loop 30583 1726853692.09075: getting the next task for host managed_node2 30583 1726853692.09084: done getting next task for host managed_node2 30583 1726853692.09087: ^ task is: TASK: Asserts 30583 1726853692.09090: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853692.09094: getting variables 30583 1726853692.09095: in VariableManager get_vars() 30583 1726853692.09126: Calling all_inventory to load vars for managed_node2 30583 1726853692.09128: Calling groups_inventory to load vars for managed_node2 30583 1726853692.09131: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853692.09208: Calling all_plugins_play to load vars for managed_node2 30583 1726853692.09212: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853692.09218: done sending task result for task 02083763-bbaf-05ea-abc5-0000000006b2 30583 1726853692.09220: WORKER PROCESS EXITING 30583 1726853692.09224: Calling groups_plugins_play to load vars for managed_node2 30583 1726853692.10526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853692.12234: done with get_vars() 30583 1726853692.12256: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 13:34:52 -0400 (0:00:00.065) 0:00:27.460 ****** 30583 1726853692.12364: entering _queue_task() for managed_node2/include_tasks 30583 1726853692.12814: worker is 1 (out of 1 available) 30583 1726853692.12827: exiting _queue_task() for managed_node2/include_tasks 30583 1726853692.12839: done queuing things up, now waiting for results queue to drain 30583 1726853692.12841: waiting for 
pending results... 30583 1726853692.13069: running TaskExecutor() for managed_node2/TASK: Asserts 30583 1726853692.13196: in run() - task 02083763-bbaf-05ea-abc5-0000000005b9 30583 1726853692.13210: variable 'ansible_search_path' from source: unknown 30583 1726853692.13214: variable 'ansible_search_path' from source: unknown 30583 1726853692.13276: variable 'lsr_assert' from source: include params 30583 1726853692.13518: variable 'lsr_assert' from source: include params 30583 1726853692.13603: variable 'omit' from source: magic vars 30583 1726853692.13751: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853692.13762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853692.13775: variable 'omit' from source: magic vars 30583 1726853692.14068: variable 'ansible_distribution_major_version' from source: facts 30583 1726853692.14073: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853692.14076: variable 'item' from source: unknown 30583 1726853692.14356: variable 'item' from source: unknown 30583 1726853692.14359: variable 'item' from source: unknown 30583 1726853692.14361: variable 'item' from source: unknown 30583 1726853692.14478: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853692.14481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853692.14484: variable 'omit' from source: magic vars 30583 1726853692.14816: variable 'ansible_distribution_major_version' from source: facts 30583 1726853692.14819: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853692.14822: variable 'item' from source: unknown 30583 1726853692.14824: variable 'item' from source: unknown 30583 1726853692.14826: variable 'item' from source: unknown 30583 1726853692.14834: variable 'item' from source: unknown 30583 1726853692.14904: dumping result to json 30583 1726853692.14907: done dumping 
result, returning 30583 1726853692.14910: done running TaskExecutor() for managed_node2/TASK: Asserts [02083763-bbaf-05ea-abc5-0000000005b9] 30583 1726853692.14912: sending task result for task 02083763-bbaf-05ea-abc5-0000000005b9 30583 1726853692.15107: done sending task result for task 02083763-bbaf-05ea-abc5-0000000005b9 30583 1726853692.15110: WORKER PROCESS EXITING 30583 1726853692.15131: no more pending results, returning what we have 30583 1726853692.15136: in VariableManager get_vars() 30583 1726853692.15165: Calling all_inventory to load vars for managed_node2 30583 1726853692.15167: Calling groups_inventory to load vars for managed_node2 30583 1726853692.15170: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853692.15181: Calling all_plugins_play to load vars for managed_node2 30583 1726853692.15184: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853692.15186: Calling groups_plugins_play to load vars for managed_node2 30583 1726853692.16426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853692.17994: done with get_vars() 30583 1726853692.18014: variable 'ansible_search_path' from source: unknown 30583 1726853692.18015: variable 'ansible_search_path' from source: unknown 30583 1726853692.18057: variable 'ansible_search_path' from source: unknown 30583 1726853692.18059: variable 'ansible_search_path' from source: unknown 30583 1726853692.18089: we have included files to process 30583 1726853692.18090: generating all_blocks data 30583 1726853692.18092: done generating all_blocks data 30583 1726853692.18097: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30583 1726853692.18098: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30583 
1726853692.18100: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30583 1726853692.18206: in VariableManager get_vars() 30583 1726853692.18225: done with get_vars() 30583 1726853692.18334: done processing included file 30583 1726853692.18336: iterating over new_blocks loaded from include file 30583 1726853692.18338: in VariableManager get_vars() 30583 1726853692.18351: done with get_vars() 30583 1726853692.18353: filtering new block on tags 30583 1726853692.18390: done filtering new block on tags 30583 1726853692.18392: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 => (item=tasks/assert_device_absent.yml) 30583 1726853692.18397: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30583 1726853692.18398: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30583 1726853692.18400: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30583 1726853692.18496: in VariableManager get_vars() 30583 1726853692.18514: done with get_vars() 30583 1726853692.18730: done processing included file 30583 1726853692.18732: iterating over new_blocks loaded from include file 30583 1726853692.18733: in VariableManager get_vars() 30583 1726853692.18745: done with get_vars() 30583 1726853692.18747: filtering new block on tags 30583 1726853692.18798: done filtering new block on tags 30583 1726853692.18800: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 => (item=tasks/assert_profile_present.yml) 30583 1726853692.18804: extending task lists for all hosts with included blocks 30583 1726853692.19808: done extending task lists 30583 1726853692.19809: done processing included files 30583 1726853692.19810: results queue empty 30583 1726853692.19810: checking for any_errors_fatal 30583 1726853692.19815: done checking for any_errors_fatal 30583 1726853692.19816: checking for max_fail_percentage 30583 1726853692.19817: done checking for max_fail_percentage 30583 1726853692.19818: checking to see if all hosts have failed and the running result is not ok 30583 1726853692.19819: done checking to see if all hosts have failed 30583 1726853692.19819: getting the remaining hosts for this loop 30583 1726853692.19821: done getting the remaining hosts for this loop 30583 1726853692.19823: getting the next task for host managed_node2 30583 1726853692.19827: done getting next task for host managed_node2 30583 1726853692.19829: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30583 1726853692.19833: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30583 1726853692.19841: getting variables 30583 1726853692.19842: in VariableManager get_vars() 30583 1726853692.19851: Calling all_inventory to load vars for managed_node2 30583 1726853692.19853: Calling groups_inventory to load vars for managed_node2 30583 1726853692.19855: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853692.19861: Calling all_plugins_play to load vars for managed_node2 30583 1726853692.19863: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853692.19866: Calling groups_plugins_play to load vars for managed_node2 30583 1726853692.21035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853692.22589: done with get_vars() 30583 1726853692.22622: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 13:34:52 -0400 (0:00:00.103) 0:00:27.564 ****** 30583 1726853692.22718: entering _queue_task() for managed_node2/include_tasks 30583 1726853692.23346: worker is 1 (out of 1 available) 30583 1726853692.23364: exiting _queue_task() for managed_node2/include_tasks 30583 1726853692.23399: done queuing things up, now waiting for results queue to drain 30583 1726853692.23401: waiting for pending results... 
30583 1726853692.23676: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 30583 1726853692.23877: in run() - task 02083763-bbaf-05ea-abc5-0000000008a8 30583 1726853692.23881: variable 'ansible_search_path' from source: unknown 30583 1726853692.23884: variable 'ansible_search_path' from source: unknown 30583 1726853692.23899: calling self._execute() 30583 1726853692.24000: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853692.24004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853692.24014: variable 'omit' from source: magic vars 30583 1726853692.24431: variable 'ansible_distribution_major_version' from source: facts 30583 1726853692.24676: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853692.24679: _execute() done 30583 1726853692.24682: dumping result to json 30583 1726853692.24684: done dumping result, returning 30583 1726853692.24686: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-05ea-abc5-0000000008a8] 30583 1726853692.24688: sending task result for task 02083763-bbaf-05ea-abc5-0000000008a8 30583 1726853692.24752: done sending task result for task 02083763-bbaf-05ea-abc5-0000000008a8 30583 1726853692.24755: WORKER PROCESS EXITING 30583 1726853692.24780: no more pending results, returning what we have 30583 1726853692.24785: in VariableManager get_vars() 30583 1726853692.24903: Calling all_inventory to load vars for managed_node2 30583 1726853692.24906: Calling groups_inventory to load vars for managed_node2 30583 1726853692.24909: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853692.24919: Calling all_plugins_play to load vars for managed_node2 30583 1726853692.24921: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853692.24924: Calling groups_plugins_play to load vars for managed_node2 30583 
1726853692.26252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853692.27898: done with get_vars() 30583 1726853692.27921: variable 'ansible_search_path' from source: unknown 30583 1726853692.27923: variable 'ansible_search_path' from source: unknown 30583 1726853692.27938: variable 'item' from source: include params 30583 1726853692.28062: variable 'item' from source: include params 30583 1726853692.28098: we have included files to process 30583 1726853692.28099: generating all_blocks data 30583 1726853692.28101: done generating all_blocks data 30583 1726853692.28102: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30583 1726853692.28103: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30583 1726853692.28105: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30583 1726853692.28296: done processing included file 30583 1726853692.28298: iterating over new_blocks loaded from include file 30583 1726853692.28300: in VariableManager get_vars() 30583 1726853692.28316: done with get_vars() 30583 1726853692.28317: filtering new block on tags 30583 1726853692.28342: done filtering new block on tags 30583 1726853692.28345: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 30583 1726853692.28350: extending task lists for all hosts with included blocks 30583 1726853692.28515: done extending task lists 30583 1726853692.28517: done processing included files 30583 1726853692.28517: results queue empty 30583 1726853692.28518: checking for any_errors_fatal 30583 1726853692.28521: done 
checking for any_errors_fatal 30583 1726853692.28522: checking for max_fail_percentage 30583 1726853692.28523: done checking for max_fail_percentage 30583 1726853692.28524: checking to see if all hosts have failed and the running result is not ok 30583 1726853692.28525: done checking to see if all hosts have failed 30583 1726853692.28525: getting the remaining hosts for this loop 30583 1726853692.28527: done getting the remaining hosts for this loop 30583 1726853692.28529: getting the next task for host managed_node2 30583 1726853692.28534: done getting next task for host managed_node2 30583 1726853692.28536: ^ task is: TASK: Get stat for interface {{ interface }} 30583 1726853692.28539: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853692.28542: getting variables 30583 1726853692.28542: in VariableManager get_vars() 30583 1726853692.28552: Calling all_inventory to load vars for managed_node2 30583 1726853692.28554: Calling groups_inventory to load vars for managed_node2 30583 1726853692.28559: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853692.28565: Calling all_plugins_play to load vars for managed_node2 30583 1726853692.28567: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853692.28570: Calling groups_plugins_play to load vars for managed_node2 30583 1726853692.29852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853692.31451: done with get_vars() 30583 1726853692.31485: done getting variables 30583 1726853692.31633: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:34:52 -0400 (0:00:00.089) 0:00:27.653 ****** 30583 1726853692.31673: entering _queue_task() for managed_node2/stat 30583 1726853692.32064: worker is 1 (out of 1 available) 30583 1726853692.32278: exiting _queue_task() for managed_node2/stat 30583 1726853692.32288: done queuing things up, now waiting for results queue to drain 30583 1726853692.32290: waiting for pending results... 
30583 1726853692.32407: running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr 30583 1726853692.32545: in run() - task 02083763-bbaf-05ea-abc5-000000000928 30583 1726853692.32560: variable 'ansible_search_path' from source: unknown 30583 1726853692.32564: variable 'ansible_search_path' from source: unknown 30583 1726853692.32601: calling self._execute() 30583 1726853692.32698: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853692.32702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853692.32712: variable 'omit' from source: magic vars 30583 1726853692.33125: variable 'ansible_distribution_major_version' from source: facts 30583 1726853692.33136: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853692.33141: variable 'omit' from source: magic vars 30583 1726853692.33202: variable 'omit' from source: magic vars 30583 1726853692.33339: variable 'interface' from source: play vars 30583 1726853692.33343: variable 'omit' from source: magic vars 30583 1726853692.33355: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853692.33585: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853692.33588: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853692.33591: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853692.33593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853692.33595: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853692.33597: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853692.33599: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853692.33776: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853692.33779: Set connection var ansible_timeout to 10 30583 1726853692.33782: Set connection var ansible_connection to ssh 30583 1726853692.33785: Set connection var ansible_shell_executable to /bin/sh 30583 1726853692.33788: Set connection var ansible_shell_type to sh 30583 1726853692.33790: Set connection var ansible_pipelining to False 30583 1726853692.33793: variable 'ansible_shell_executable' from source: unknown 30583 1726853692.33795: variable 'ansible_connection' from source: unknown 30583 1726853692.33798: variable 'ansible_module_compression' from source: unknown 30583 1726853692.33800: variable 'ansible_shell_type' from source: unknown 30583 1726853692.33802: variable 'ansible_shell_executable' from source: unknown 30583 1726853692.33805: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853692.33807: variable 'ansible_pipelining' from source: unknown 30583 1726853692.33810: variable 'ansible_timeout' from source: unknown 30583 1726853692.33812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853692.33864: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853692.33876: variable 'omit' from source: magic vars 30583 1726853692.33883: starting attempt loop 30583 1726853692.33886: running the handler 30583 1726853692.33899: _low_level_execute_command(): starting 30583 1726853692.33906: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853692.34711: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853692.34768: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853692.34786: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853692.34810: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853692.34915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853692.36673: stdout chunk (state=3): >>>/root <<< 30583 1726853692.36989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853692.36993: stdout chunk (state=3): >>><<< 30583 1726853692.36995: stderr chunk (state=3): >>><<< 30583 1726853692.37000: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853692.37003: _low_level_execute_command(): starting 30583 1726853692.37005: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853692.3684404-31823-11119397473352 `" && echo ansible-tmp-1726853692.3684404-31823-11119397473352="` echo /root/.ansible/tmp/ansible-tmp-1726853692.3684404-31823-11119397473352 `" ) && sleep 0' 30583 1726853692.37492: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853692.37502: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853692.37514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853692.37528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853692.37541: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853692.37548: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853692.37583: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853692.37586: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853692.37589: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853692.37591: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853692.37638: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853692.37641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853692.37644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853692.37646: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853692.37648: stderr chunk (state=3): >>>debug2: match found <<< 30583 1726853692.37650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853692.37732: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853692.37735: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853692.37760: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853692.37870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853692.39880: stdout chunk (state=3): >>>ansible-tmp-1726853692.3684404-31823-11119397473352=/root/.ansible/tmp/ansible-tmp-1726853692.3684404-31823-11119397473352 <<< 30583 1726853692.39981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853692.40020: stderr chunk (state=3): >>><<< 30583 1726853692.40023: stdout chunk (state=3): >>><<< 30583 1726853692.40035: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853692.3684404-31823-11119397473352=/root/.ansible/tmp/ansible-tmp-1726853692.3684404-31823-11119397473352 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853692.40086: variable 'ansible_module_compression' from source: unknown 30583 1726853692.40132: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30583 1726853692.40170: variable 'ansible_facts' from source: unknown 30583 1726853692.40221: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853692.3684404-31823-11119397473352/AnsiballZ_stat.py 30583 1726853692.40332: Sending initial data 30583 1726853692.40336: Sent initial data (152 bytes) 30583 1726853692.40931: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853692.40944: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853692.40965: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853692.41008: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853692.41110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853692.42769: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30583 1726853692.42781: stderr chunk (state=3): >>>debug2: 
Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853692.42836: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853692.42909: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpbii84uj8 /root/.ansible/tmp/ansible-tmp-1726853692.3684404-31823-11119397473352/AnsiballZ_stat.py <<< 30583 1726853692.42913: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853692.3684404-31823-11119397473352/AnsiballZ_stat.py" <<< 30583 1726853692.42986: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpbii84uj8" to remote "/root/.ansible/tmp/ansible-tmp-1726853692.3684404-31823-11119397473352/AnsiballZ_stat.py" <<< 30583 1726853692.42988: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853692.3684404-31823-11119397473352/AnsiballZ_stat.py" <<< 30583 1726853692.43776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853692.43779: stderr chunk (state=3): >>><<< 30583 1726853692.43780: stdout chunk (state=3): >>><<< 30583 1726853692.43794: done transferring module to remote 30583 1726853692.43804: _low_level_execute_command(): starting 30583 1726853692.43811: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853692.3684404-31823-11119397473352/ /root/.ansible/tmp/ansible-tmp-1726853692.3684404-31823-11119397473352/AnsiballZ_stat.py && sleep 0' 30583 1726853692.44461: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853692.44470: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853692.44536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853692.44583: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853692.44594: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853692.44612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853692.44717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853692.46626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853692.46638: stderr chunk (state=3): >>><<< 30583 1726853692.46641: stdout chunk (state=3): >>><<< 30583 1726853692.46662: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853692.46666: _low_level_execute_command(): starting 30583 1726853692.46668: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853692.3684404-31823-11119397473352/AnsiballZ_stat.py && sleep 0' 30583 1726853692.47231: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853692.47259: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853692.47262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853692.47266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853692.47269: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853692.47285: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853692.47356: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853692.47359: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853692.47469: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853692.63309: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30583 1726853692.64709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853692.64714: stderr chunk (state=3): >>><<< 30583 1726853692.64787: stdout chunk (state=3): >>><<< 30583 1726853692.64792: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853692.64796: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853692.3684404-31823-11119397473352/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853692.64798: _low_level_execute_command(): starting 30583 1726853692.64800: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853692.3684404-31823-11119397473352/ > /dev/null 2>&1 && sleep 0' 30583 1726853692.65434: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853692.65443: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853692.65461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853692.65481: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853692.65500: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853692.65519: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853692.65522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853692.65543: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853692.65557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853692.65594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853692.65602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853692.65639: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853692.65646: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853692.65731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853692.67876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853692.67880: stdout chunk (state=3): >>><<< 30583 1726853692.67882: stderr chunk (state=3): >>><<< 30583 1726853692.67885: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853692.67886: handler run complete 30583 1726853692.67888: attempt loop complete, returning result 30583 1726853692.67890: _execute() done 30583 1726853692.67892: dumping result to json 30583 1726853692.67893: done dumping result, returning 30583 1726853692.67895: done running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr [02083763-bbaf-05ea-abc5-000000000928] 30583 1726853692.67897: sending task result for task 02083763-bbaf-05ea-abc5-000000000928 30583 1726853692.67970: done sending task result for task 02083763-bbaf-05ea-abc5-000000000928 30583 1726853692.67976: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 30583 1726853692.68093: no more pending results, returning what we have 30583 1726853692.68099: results queue empty 30583 1726853692.68100: checking for any_errors_fatal 30583 1726853692.68102: done checking for any_errors_fatal 30583 1726853692.68103: checking for max_fail_percentage 
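For reference, the `ok: [managed_node2]` result above comes from the `stat` module invocation whose `module_args` appear in the `_execute_module` call earlier in this trace. A hedged reconstruction of the task (the actual test source is not shown in this log; the task name matches the TASK banner, but the `register`/`set_fact` wiring is an assumption):

```yaml
# Hypothetical sketch reconstructed from the logged module_args
# (path, get_attributes, get_checksum, get_mime); not copied from the
# actual test file.
- name: Get stat for interface statebr
  stat:
    path: "/sys/class/net/{{ interface }}"  # interface == 'statebr' per play vars
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat  # assumption: the trace later shows 'interface_stat'
                            # coming from set_fact, so the real test may copy
                            # the registered result into a fact instead
```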
30583 1726853692.68105: done checking for max_fail_percentage 30583 1726853692.68106: checking to see if all hosts have failed and the running result is not ok 30583 1726853692.68107: done checking to see if all hosts have failed 30583 1726853692.68107: getting the remaining hosts for this loop 30583 1726853692.68110: done getting the remaining hosts for this loop 30583 1726853692.68114: getting the next task for host managed_node2 30583 1726853692.68125: done getting next task for host managed_node2 30583 1726853692.68127: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 30583 1726853692.68133: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853692.68139: getting variables 30583 1726853692.68141: in VariableManager get_vars() 30583 1726853692.68261: Calling all_inventory to load vars for managed_node2 30583 1726853692.68265: Calling groups_inventory to load vars for managed_node2 30583 1726853692.68268: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853692.68291: Calling all_plugins_play to load vars for managed_node2 30583 1726853692.68294: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853692.68297: Calling groups_plugins_play to load vars for managed_node2 30583 1726853692.69536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853692.70693: done with get_vars() 30583 1726853692.70713: done getting variables 30583 1726853692.70770: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853692.70904: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 13:34:52 -0400 (0:00:00.392) 0:00:28.046 ****** 30583 1726853692.70940: entering _queue_task() for managed_node2/assert 30583 1726853692.71684: worker is 1 (out of 1 available) 30583 1726853692.71696: exiting _queue_task() for managed_node2/assert 30583 1726853692.71706: done queuing things up, now waiting for results queue to drain 30583 1726853692.71707: waiting for pending results... 
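The task queued here lives at `playbooks/tasks/assert_device_absent.yml:5` per the task path above, and the trace that follows shows it evaluating the conditional `not interface_stat.stat.exists`. A minimal sketch of what such an assert task typically looks like (reconstructed from that evaluated conditional, not copied from the file; the failure message wording is an assumption):

```yaml
# Sketch only: reconstructed from the conditional evaluated in this trace;
# the msg text is assumed, not taken from assert_device_absent.yml.
- name: Assert that the interface is absent - '{{ interface }}'
  assert:
    that:
      - not interface_stat.stat.exists
    msg: "Interface {{ interface }} is still present"  # assumed wording
```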
30583 1726853692.72191: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' 30583 1726853692.72281: in run() - task 02083763-bbaf-05ea-abc5-0000000008a9 30583 1726853692.72302: variable 'ansible_search_path' from source: unknown 30583 1726853692.72308: variable 'ansible_search_path' from source: unknown 30583 1726853692.72346: calling self._execute() 30583 1726853692.72440: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853692.72451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853692.72465: variable 'omit' from source: magic vars 30583 1726853692.72844: variable 'ansible_distribution_major_version' from source: facts 30583 1726853692.72939: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853692.72943: variable 'omit' from source: magic vars 30583 1726853692.72946: variable 'omit' from source: magic vars 30583 1726853692.73018: variable 'interface' from source: play vars 30583 1726853692.73050: variable 'omit' from source: magic vars 30583 1726853692.73096: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853692.73135: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853692.73166: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853692.73190: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853692.73205: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853692.73239: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853692.73248: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853692.73257: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853692.73362: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853692.73380: Set connection var ansible_timeout to 10 30583 1726853692.73478: Set connection var ansible_connection to ssh 30583 1726853692.73481: Set connection var ansible_shell_executable to /bin/sh 30583 1726853692.73484: Set connection var ansible_shell_type to sh 30583 1726853692.73485: Set connection var ansible_pipelining to False 30583 1726853692.73487: variable 'ansible_shell_executable' from source: unknown 30583 1726853692.73490: variable 'ansible_connection' from source: unknown 30583 1726853692.73492: variable 'ansible_module_compression' from source: unknown 30583 1726853692.73494: variable 'ansible_shell_type' from source: unknown 30583 1726853692.73496: variable 'ansible_shell_executable' from source: unknown 30583 1726853692.73501: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853692.73503: variable 'ansible_pipelining' from source: unknown 30583 1726853692.73505: variable 'ansible_timeout' from source: unknown 30583 1726853692.73507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853692.73950: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853692.73966: variable 'omit' from source: magic vars 30583 1726853692.73977: starting attempt loop 30583 1726853692.73983: running the handler 30583 1726853692.74251: variable 'interface_stat' from source: set_fact 30583 1726853692.74373: Evaluated conditional (not interface_stat.stat.exists): True 30583 1726853692.74385: handler run complete 30583 1726853692.74404: attempt loop complete, returning result 
30583 1726853692.74411: _execute() done 30583 1726853692.74416: dumping result to json 30583 1726853692.74423: done dumping result, returning 30583 1726853692.74433: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' [02083763-bbaf-05ea-abc5-0000000008a9] 30583 1726853692.74478: sending task result for task 02083763-bbaf-05ea-abc5-0000000008a9 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30583 1726853692.74636: no more pending results, returning what we have 30583 1726853692.74642: results queue empty 30583 1726853692.74644: checking for any_errors_fatal 30583 1726853692.74657: done checking for any_errors_fatal 30583 1726853692.74658: checking for max_fail_percentage 30583 1726853692.74660: done checking for max_fail_percentage 30583 1726853692.74661: checking to see if all hosts have failed and the running result is not ok 30583 1726853692.74662: done checking to see if all hosts have failed 30583 1726853692.74662: getting the remaining hosts for this loop 30583 1726853692.74664: done getting the remaining hosts for this loop 30583 1726853692.74669: getting the next task for host managed_node2 30583 1726853692.74681: done getting next task for host managed_node2 30583 1726853692.74684: ^ task is: TASK: Include the task 'get_profile_stat.yml' 30583 1726853692.74689: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853692.74695: getting variables 30583 1726853692.74697: in VariableManager get_vars() 30583 1726853692.74731: Calling all_inventory to load vars for managed_node2 30583 1726853692.74734: Calling groups_inventory to load vars for managed_node2 30583 1726853692.74738: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853692.74749: Calling all_plugins_play to load vars for managed_node2 30583 1726853692.74751: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853692.74754: Calling groups_plugins_play to load vars for managed_node2 30583 1726853692.75287: done sending task result for task 02083763-bbaf-05ea-abc5-0000000008a9 30583 1726853692.75291: WORKER PROCESS EXITING 30583 1726853692.76621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853692.78766: done with get_vars() 30583 1726853692.78797: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 13:34:52 -0400 (0:00:00.079) 0:00:28.126 ****** 30583 1726853692.78896: entering _queue_task() for managed_node2/include_tasks 30583 1726853692.79257: worker is 1 (out of 1 available) 30583 1726853692.79473: exiting _queue_task() for managed_node2/include_tasks 30583 1726853692.79484: done queuing things up, now waiting for results queue to drain 30583 1726853692.79485: waiting for pending results... 
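The include task queued here resolves, per the trace that follows, to `tasks/get_profile_stat.yml` via an `item` include parameter. A hedged sketch of the include (the surrounding loop that supplies `item` is not visible in this chunk and is an assumption):

```yaml
# Hypothetical shape of the include at tasks/assert_profile_present.yml:3;
# the trace shows 'item' arriving as an include param and resolving to
# tasks/get_profile_stat.yml. The loop feeding 'item' is not shown here.
- name: Include the task 'get_profile_stat.yml'
  include_tasks: "{{ item }}"
```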
30583 1726853692.79599: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 30583 1726853692.79896: in run() - task 02083763-bbaf-05ea-abc5-0000000008ad 30583 1726853692.79900: variable 'ansible_search_path' from source: unknown 30583 1726853692.79909: variable 'ansible_search_path' from source: unknown 30583 1726853692.79912: calling self._execute() 30583 1726853692.79915: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853692.79918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853692.79920: variable 'omit' from source: magic vars 30583 1726853692.80831: variable 'ansible_distribution_major_version' from source: facts 30583 1726853692.80835: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853692.80838: _execute() done 30583 1726853692.80840: dumping result to json 30583 1726853692.80842: done dumping result, returning 30583 1726853692.80844: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-05ea-abc5-0000000008ad] 30583 1726853692.80847: sending task result for task 02083763-bbaf-05ea-abc5-0000000008ad 30583 1726853692.81116: done sending task result for task 02083763-bbaf-05ea-abc5-0000000008ad 30583 1726853692.81119: WORKER PROCESS EXITING 30583 1726853692.81154: no more pending results, returning what we have 30583 1726853692.81164: in VariableManager get_vars() 30583 1726853692.81212: Calling all_inventory to load vars for managed_node2 30583 1726853692.81217: Calling groups_inventory to load vars for managed_node2 30583 1726853692.81221: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853692.81237: Calling all_plugins_play to load vars for managed_node2 30583 1726853692.81241: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853692.81245: Calling groups_plugins_play to load vars for managed_node2 30583 
1726853692.83108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853692.84764: done with get_vars() 30583 1726853692.84789: variable 'ansible_search_path' from source: unknown 30583 1726853692.84790: variable 'ansible_search_path' from source: unknown 30583 1726853692.84801: variable 'item' from source: include params 30583 1726853692.84905: variable 'item' from source: include params 30583 1726853692.84940: we have included files to process 30583 1726853692.84942: generating all_blocks data 30583 1726853692.84943: done generating all_blocks data 30583 1726853692.84947: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30583 1726853692.84948: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30583 1726853692.84950: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30583 1726853692.85977: done processing included file 30583 1726853692.85979: iterating over new_blocks loaded from include file 30583 1726853692.85981: in VariableManager get_vars() 30583 1726853692.85998: done with get_vars() 30583 1726853692.86000: filtering new block on tags 30583 1726853692.86085: done filtering new block on tags 30583 1726853692.86088: in VariableManager get_vars() 30583 1726853692.86104: done with get_vars() 30583 1726853692.86106: filtering new block on tags 30583 1726853692.86168: done filtering new block on tags 30583 1726853692.86172: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 30583 1726853692.86182: extending task lists for all hosts with included blocks 30583 1726853692.86549: done 
extending task lists 30583 1726853692.86551: done processing included files 30583 1726853692.86551: results queue empty 30583 1726853692.86552: checking for any_errors_fatal 30583 1726853692.86555: done checking for any_errors_fatal 30583 1726853692.86556: checking for max_fail_percentage 30583 1726853692.86557: done checking for max_fail_percentage 30583 1726853692.86558: checking to see if all hosts have failed and the running result is not ok 30583 1726853692.86559: done checking to see if all hosts have failed 30583 1726853692.86559: getting the remaining hosts for this loop 30583 1726853692.86561: done getting the remaining hosts for this loop 30583 1726853692.86563: getting the next task for host managed_node2 30583 1726853692.86568: done getting next task for host managed_node2 30583 1726853692.86570: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 30583 1726853692.86576: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30583 1726853692.86578: getting variables 30583 1726853692.86579: in VariableManager get_vars() 30583 1726853692.86588: Calling all_inventory to load vars for managed_node2 30583 1726853692.86590: Calling groups_inventory to load vars for managed_node2 30583 1726853692.86592: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853692.86598: Calling all_plugins_play to load vars for managed_node2 30583 1726853692.86600: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853692.86603: Calling groups_plugins_play to load vars for managed_node2 30583 1726853692.87760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853692.89328: done with get_vars() 30583 1726853692.89352: done getting variables 30583 1726853692.89400: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:34:52 -0400 (0:00:00.105) 0:00:28.231 ****** 30583 1726853692.89433: entering _queue_task() for managed_node2/set_fact 30583 1726853692.89789: worker is 1 (out of 1 available) 30583 1726853692.89801: exiting _queue_task() for managed_node2/set_fact 30583 1726853692.89813: done queuing things up, now waiting for results queue to drain 30583 1726853692.89814: waiting for pending results... 
30583 1726853692.90194: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 30583 1726853692.90239: in run() - task 02083763-bbaf-05ea-abc5-000000000946 30583 1726853692.90254: variable 'ansible_search_path' from source: unknown 30583 1726853692.90260: variable 'ansible_search_path' from source: unknown 30583 1726853692.90295: calling self._execute() 30583 1726853692.90403: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853692.90409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853692.90413: variable 'omit' from source: magic vars 30583 1726853692.90795: variable 'ansible_distribution_major_version' from source: facts 30583 1726853692.90799: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853692.90802: variable 'omit' from source: magic vars 30583 1726853692.90853: variable 'omit' from source: magic vars 30583 1726853692.90894: variable 'omit' from source: magic vars 30583 1726853692.90939: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853692.91076: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853692.91080: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853692.91083: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853692.91086: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853692.91097: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853692.91100: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853692.91105: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30583 1726853692.91218: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853692.91224: Set connection var ansible_timeout to 10 30583 1726853692.91227: Set connection var ansible_connection to ssh 30583 1726853692.91231: Set connection var ansible_shell_executable to /bin/sh 30583 1726853692.91234: Set connection var ansible_shell_type to sh 30583 1726853692.91244: Set connection var ansible_pipelining to False 30583 1726853692.91269: variable 'ansible_shell_executable' from source: unknown 30583 1726853692.91273: variable 'ansible_connection' from source: unknown 30583 1726853692.91277: variable 'ansible_module_compression' from source: unknown 30583 1726853692.91307: variable 'ansible_shell_type' from source: unknown 30583 1726853692.91312: variable 'ansible_shell_executable' from source: unknown 30583 1726853692.91315: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853692.91317: variable 'ansible_pipelining' from source: unknown 30583 1726853692.91319: variable 'ansible_timeout' from source: unknown 30583 1726853692.91321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853692.91601: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853692.91609: variable 'omit' from source: magic vars 30583 1726853692.91612: starting attempt loop 30583 1726853692.91614: running the handler 30583 1726853692.91615: handler run complete 30583 1726853692.91620: attempt loop complete, returning result 30583 1726853692.91622: _execute() done 30583 1726853692.91624: dumping result to json 30583 1726853692.91629: done dumping result, returning 30583 1726853692.91631: done running TaskExecutor() for 
managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [02083763-bbaf-05ea-abc5-000000000946] 30583 1726853692.91679: sending task result for task 02083763-bbaf-05ea-abc5-000000000946 30583 1726853692.91903: done sending task result for task 02083763-bbaf-05ea-abc5-000000000946 30583 1726853692.91907: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 30583 1726853692.91960: no more pending results, returning what we have 30583 1726853692.91964: results queue empty 30583 1726853692.91965: checking for any_errors_fatal 30583 1726853692.91966: done checking for any_errors_fatal 30583 1726853692.91967: checking for max_fail_percentage 30583 1726853692.91968: done checking for max_fail_percentage 30583 1726853692.91969: checking to see if all hosts have failed and the running result is not ok 30583 1726853692.91973: done checking to see if all hosts have failed 30583 1726853692.91974: getting the remaining hosts for this loop 30583 1726853692.91976: done getting the remaining hosts for this loop 30583 1726853692.91979: getting the next task for host managed_node2 30583 1726853692.91987: done getting next task for host managed_node2 30583 1726853692.91989: ^ task is: TASK: Stat profile file 30583 1726853692.91994: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853692.91998: getting variables 30583 1726853692.92000: in VariableManager get_vars() 30583 1726853692.92031: Calling all_inventory to load vars for managed_node2 30583 1726853692.92034: Calling groups_inventory to load vars for managed_node2 30583 1726853692.92037: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853692.92047: Calling all_plugins_play to load vars for managed_node2 30583 1726853692.92052: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853692.92055: Calling groups_plugins_play to load vars for managed_node2 30583 1726853692.94293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853692.95694: done with get_vars() 30583 1726853692.95713: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:34:52 -0400 (0:00:00.063) 0:00:28.295 ****** 30583 1726853692.95804: entering _queue_task() for managed_node2/stat 30583 1726853692.96090: worker is 1 (out of 1 available) 30583 1726853692.96105: exiting _queue_task() for managed_node2/stat 30583 1726853692.96117: done queuing things up, now waiting for results queue to drain 30583 1726853692.96118: 
waiting for pending results... 30583 1726853692.96364: running TaskExecutor() for managed_node2/TASK: Stat profile file 30583 1726853692.96413: in run() - task 02083763-bbaf-05ea-abc5-000000000947 30583 1726853692.96425: variable 'ansible_search_path' from source: unknown 30583 1726853692.96429: variable 'ansible_search_path' from source: unknown 30583 1726853692.96459: calling self._execute() 30583 1726853692.96534: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853692.96538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853692.96546: variable 'omit' from source: magic vars 30583 1726853692.97176: variable 'ansible_distribution_major_version' from source: facts 30583 1726853692.97180: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853692.97182: variable 'omit' from source: magic vars 30583 1726853692.97186: variable 'omit' from source: magic vars 30583 1726853692.97188: variable 'profile' from source: play vars 30583 1726853692.97191: variable 'interface' from source: play vars 30583 1726853692.97264: variable 'interface' from source: play vars 30583 1726853692.97293: variable 'omit' from source: magic vars 30583 1726853692.97352: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853692.97396: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853692.97430: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853692.97453: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853692.97474: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853692.97507: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 30583 1726853692.97514: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853692.97531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853692.97638: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853692.97750: Set connection var ansible_timeout to 10 30583 1726853692.97756: Set connection var ansible_connection to ssh 30583 1726853692.97759: Set connection var ansible_shell_executable to /bin/sh 30583 1726853692.97761: Set connection var ansible_shell_type to sh 30583 1726853692.97764: Set connection var ansible_pipelining to False 30583 1726853692.97766: variable 'ansible_shell_executable' from source: unknown 30583 1726853692.97768: variable 'ansible_connection' from source: unknown 30583 1726853692.97770: variable 'ansible_module_compression' from source: unknown 30583 1726853692.97774: variable 'ansible_shell_type' from source: unknown 30583 1726853692.97777: variable 'ansible_shell_executable' from source: unknown 30583 1726853692.97778: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853692.97780: variable 'ansible_pipelining' from source: unknown 30583 1726853692.97782: variable 'ansible_timeout' from source: unknown 30583 1726853692.97784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853692.98031: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853692.98056: variable 'omit' from source: magic vars 30583 1726853692.98080: starting attempt loop 30583 1726853692.98087: running the handler 30583 1726853692.98112: _low_level_execute_command(): starting 30583 1726853692.98127: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 
1726853692.98844: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853692.98902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853692.98927: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853692.99014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853693.00835: stdout chunk (state=3): >>>/root <<< 30583 1726853693.00921: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853693.00929: stdout chunk (state=3): >>><<< 30583 1726853693.00932: stderr chunk (state=3): >>><<< 30583 1726853693.00935: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853693.01028: _low_level_execute_command(): starting 30583 1726853693.01034: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853693.0093253-31861-273959125190778 `" && echo ansible-tmp-1726853693.0093253-31861-273959125190778="` echo /root/.ansible/tmp/ansible-tmp-1726853693.0093253-31861-273959125190778 `" ) && sleep 0' 30583 1726853693.01649: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853693.01659: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853693.01699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853693.01703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853693.01721: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853693.01733: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853693.01735: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853693.01909: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853693.01913: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853693.01916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853693.01923: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853693.01999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853693.03980: stdout chunk (state=3): >>>ansible-tmp-1726853693.0093253-31861-273959125190778=/root/.ansible/tmp/ansible-tmp-1726853693.0093253-31861-273959125190778 <<< 30583 1726853693.04083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853693.04155: stderr chunk (state=3): >>><<< 30583 1726853693.04159: stdout chunk (state=3): >>><<< 30583 1726853693.04339: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853693.0093253-31861-273959125190778=/root/.ansible/tmp/ansible-tmp-1726853693.0093253-31861-273959125190778 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853693.04343: variable 'ansible_module_compression' from source: unknown 30583 1726853693.04345: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30583 1726853693.04347: variable 'ansible_facts' from source: unknown 30583 1726853693.04446: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853693.0093253-31861-273959125190778/AnsiballZ_stat.py 30583 1726853693.04596: Sending initial data 30583 1726853693.04732: Sent initial data (153 bytes) 30583 1726853693.05680: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853693.05694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853693.05697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853693.05741: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853693.05861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853693.07520: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30583 1726853693.07524: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853693.07594: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853693.07663: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpcwne3i3u /root/.ansible/tmp/ansible-tmp-1726853693.0093253-31861-273959125190778/AnsiballZ_stat.py <<< 30583 1726853693.07670: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853693.0093253-31861-273959125190778/AnsiballZ_stat.py" <<< 30583 1726853693.07753: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpcwne3i3u" to remote "/root/.ansible/tmp/ansible-tmp-1726853693.0093253-31861-273959125190778/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853693.0093253-31861-273959125190778/AnsiballZ_stat.py" <<< 30583 1726853693.08878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853693.08881: stderr chunk (state=3): >>><<< 30583 1726853693.08884: stdout chunk (state=3): >>><<< 30583 1726853693.08900: done transferring module to remote 30583 1726853693.08914: _low_level_execute_command(): starting 30583 1726853693.08918: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853693.0093253-31861-273959125190778/ /root/.ansible/tmp/ansible-tmp-1726853693.0093253-31861-273959125190778/AnsiballZ_stat.py && sleep 0' 30583 1726853693.09463: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853693.09473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853693.09499: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 
1726853693.09502: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853693.09509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853693.09595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853693.09598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853693.09711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853693.11587: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853693.11612: stderr chunk (state=3): >>><<< 30583 1726853693.11616: stdout chunk (state=3): >>><<< 30583 1726853693.11630: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853693.11633: _low_level_execute_command(): starting 30583 1726853693.11636: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853693.0093253-31861-273959125190778/AnsiballZ_stat.py && sleep 0' 30583 1726853693.12202: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853693.12205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853693.12208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853693.12210: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853693.12212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853693.12268: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853693.12281: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853693.12348: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853693.28044: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30583 1726853693.29479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853693.29483: stdout chunk (state=3): >>><<< 30583 1726853693.29486: stderr chunk (state=3): >>><<< 30583 1726853693.29617: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853693.29622: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853693.0093253-31861-273959125190778/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853693.29625: _low_level_execute_command(): starting 30583 1726853693.29627: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853693.0093253-31861-273959125190778/ > /dev/null 2>&1 && sleep 0' 30583 1726853693.30191: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853693.30209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853693.30289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853693.30336: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853693.30352: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853693.30378: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853693.30647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853693.32623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853693.32637: stderr chunk (state=3): >>><<< 30583 1726853693.32646: stdout chunk (state=3): >>><<< 30583 1726853693.32668: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853693.32703: handler run complete 30583 1726853693.32878: attempt loop complete, returning result 30583 1726853693.32881: _execute() done 30583 1726853693.32883: dumping result to json 30583 1726853693.32885: done dumping result, returning 30583 1726853693.32887: done running TaskExecutor() for managed_node2/TASK: Stat profile file [02083763-bbaf-05ea-abc5-000000000947] 30583 1726853693.32889: sending task result for task 02083763-bbaf-05ea-abc5-000000000947 30583 1726853693.32962: done sending task result for task 02083763-bbaf-05ea-abc5-000000000947 ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 30583 1726853693.33027: no more pending results, returning what we have 30583 1726853693.33030: results queue empty 30583 1726853693.33033: checking for any_errors_fatal 30583 1726853693.33043: done checking for any_errors_fatal 30583 1726853693.33044: checking for max_fail_percentage 30583 1726853693.33046: done checking for max_fail_percentage 30583 1726853693.33047: checking to see if all hosts have failed and the running result is not ok 30583 1726853693.33048: done checking to see if all hosts have failed 30583 1726853693.33049: getting the remaining hosts for this loop 30583 1726853693.33051: done getting the remaining hosts for this loop 30583 1726853693.33055: getting the next task for host managed_node2 30583 1726853693.33064: done getting next task for host managed_node2 30583 1726853693.33067: ^ task is: TASK: Set NM profile exist flag based on the profile 
files 30583 1726853693.33074: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853693.33079: getting variables 30583 1726853693.33081: in VariableManager get_vars() 30583 1726853693.33116: Calling all_inventory to load vars for managed_node2 30583 1726853693.33119: Calling groups_inventory to load vars for managed_node2 30583 1726853693.33123: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853693.33135: Calling all_plugins_play to load vars for managed_node2 30583 1726853693.33139: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853693.33142: Calling groups_plugins_play to load vars for managed_node2 30583 1726853693.33935: WORKER PROCESS EXITING 30583 1726853693.36306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853693.39457: done with get_vars() 30583 1726853693.39481: done getting variables 30583 1726853693.39625: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:34:53 -0400 (0:00:00.438) 0:00:28.733 ****** 30583 1726853693.39660: entering _queue_task() for managed_node2/set_fact 30583 1726853693.40035: worker is 1 (out of 1 available) 30583 1726853693.40165: exiting _queue_task() for managed_node2/set_fact 30583 1726853693.40180: done queuing things up, now waiting for results queue to drain 30583 1726853693.40181: waiting for pending results... 
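The `ok: [managed_node2]` result above ("exists": false) came from the "Stat profile file" task in `get_profile_stat.yml`. A minimal sketch of what that task likely looks like, reconstructed only from the `module_args` dumped in the log; the `register: profile_stat` name is inferred from the `profile_stat` variable evaluated by the following task, and the exact YAML layout is an assumption, not the file's actual contents:

```yaml
# Sketch reconstructed from the logged module_args; not the actual
# contents of get_profile_stat.yml. The register name is inferred
# from the later "variable 'profile_stat' from source: set_fact" line.
- name: Stat profile file
  stat:
    path: /etc/sysconfig/network-scripts/ifcfg-statebr
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat
```

Disabling `get_attributes`, `get_checksum`, and `get_mime` keeps the remote `stat` call cheap when the only question is whether the profile file exists.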
30583 1726853693.40392: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 30583 1726853693.40574: in run() - task 02083763-bbaf-05ea-abc5-000000000948 30583 1726853693.40601: variable 'ansible_search_path' from source: unknown 30583 1726853693.40610: variable 'ansible_search_path' from source: unknown 30583 1726853693.40932: calling self._execute() 30583 1726853693.41161: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853693.41175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853693.41190: variable 'omit' from source: magic vars 30583 1726853693.41572: variable 'ansible_distribution_major_version' from source: facts 30583 1726853693.41593: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853693.41721: variable 'profile_stat' from source: set_fact 30583 1726853693.41736: Evaluated conditional (profile_stat.stat.exists): False 30583 1726853693.41775: when evaluation is False, skipping this task 30583 1726853693.41778: _execute() done 30583 1726853693.41785: dumping result to json 30583 1726853693.41788: done dumping result, returning 30583 1726853693.41793: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [02083763-bbaf-05ea-abc5-000000000948] 30583 1726853693.41798: sending task result for task 02083763-bbaf-05ea-abc5-000000000948 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30583 1726853693.42119: no more pending results, returning what we have 30583 1726853693.42123: results queue empty 30583 1726853693.42124: checking for any_errors_fatal 30583 1726853693.42131: done checking for any_errors_fatal 30583 1726853693.42132: checking for max_fail_percentage 30583 1726853693.42134: done checking for max_fail_percentage 30583 1726853693.42135: checking to see if all 
hosts have failed and the running result is not ok 30583 1726853693.42136: done checking to see if all hosts have failed 30583 1726853693.42137: getting the remaining hosts for this loop 30583 1726853693.42139: done getting the remaining hosts for this loop 30583 1726853693.42142: getting the next task for host managed_node2 30583 1726853693.42151: done getting next task for host managed_node2 30583 1726853693.42153: ^ task is: TASK: Get NM profile info 30583 1726853693.42158: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853693.42161: getting variables 30583 1726853693.42163: in VariableManager get_vars() 30583 1726853693.42197: Calling all_inventory to load vars for managed_node2 30583 1726853693.42199: Calling groups_inventory to load vars for managed_node2 30583 1726853693.42203: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853693.42214: Calling all_plugins_play to load vars for managed_node2 30583 1726853693.42217: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853693.42220: Calling groups_plugins_play to load vars for managed_node2 30583 1726853693.42786: done sending task result for task 02083763-bbaf-05ea-abc5-000000000948 30583 1726853693.42789: WORKER PROCESS EXITING 30583 1726853693.43738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853693.46064: done with get_vars() 30583 1726853693.46101: done getting variables 30583 1726853693.46204: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:34:53 -0400 (0:00:00.065) 0:00:28.799 ****** 30583 1726853693.46240: entering _queue_task() for managed_node2/shell 30583 1726853693.46702: worker is 1 (out of 1 available) 30583 1726853693.46713: exiting _queue_task() for managed_node2/shell 30583 1726853693.46838: done queuing things up, now waiting for results queue to drain 30583 1726853693.46840: waiting for pending results... 
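The `skipping: [managed_node2]` result above shows the conditional short-circuit: `profile_stat.stat.exists` evaluated to False, so the `set_fact` handler never ran. A hedged sketch of such a conditional flag task; only the task name and the `when:` condition (reported as `false_condition` in the log) are taken from the output, while the fact name and value are assumptions:

```yaml
# Sketch only: task name and when: condition appear in the log;
# the fact being set (name and value) is an assumption.
- name: Set NM profile exist flag based on the profile files
  set_fact:
    nm_profile_exists: true  # assumed flag name
  when: profile_stat.stat.exists
```

When the condition is False, Ansible records `"skip_reason": "Conditional result was False"` in the task result, exactly as seen above, rather than executing the action plugin.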
30583 1726853693.47093: running TaskExecutor() for managed_node2/TASK: Get NM profile info 30583 1726853693.47229: in run() - task 02083763-bbaf-05ea-abc5-000000000949 30583 1726853693.47249: variable 'ansible_search_path' from source: unknown 30583 1726853693.47258: variable 'ansible_search_path' from source: unknown 30583 1726853693.47303: calling self._execute() 30583 1726853693.47399: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853693.47409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853693.47424: variable 'omit' from source: magic vars 30583 1726853693.47880: variable 'ansible_distribution_major_version' from source: facts 30583 1726853693.47897: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853693.47907: variable 'omit' from source: magic vars 30583 1726853693.47970: variable 'omit' from source: magic vars 30583 1726853693.48080: variable 'profile' from source: play vars 30583 1726853693.48090: variable 'interface' from source: play vars 30583 1726853693.48160: variable 'interface' from source: play vars 30583 1726853693.48186: variable 'omit' from source: magic vars 30583 1726853693.48230: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853693.48280: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853693.48305: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853693.48325: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853693.48339: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853693.48473: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 
1726853693.48476: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853693.48478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853693.48503: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853693.48513: Set connection var ansible_timeout to 10 30583 1726853693.48520: Set connection var ansible_connection to ssh 30583 1726853693.48528: Set connection var ansible_shell_executable to /bin/sh 30583 1726853693.48534: Set connection var ansible_shell_type to sh 30583 1726853693.48548: Set connection var ansible_pipelining to False 30583 1726853693.48582: variable 'ansible_shell_executable' from source: unknown 30583 1726853693.48589: variable 'ansible_connection' from source: unknown 30583 1726853693.48595: variable 'ansible_module_compression' from source: unknown 30583 1726853693.48601: variable 'ansible_shell_type' from source: unknown 30583 1726853693.48606: variable 'ansible_shell_executable' from source: unknown 30583 1726853693.48611: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853693.48618: variable 'ansible_pipelining' from source: unknown 30583 1726853693.48624: variable 'ansible_timeout' from source: unknown 30583 1726853693.48630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853693.48776: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853693.48799: variable 'omit' from source: magic vars 30583 1726853693.48810: starting attempt loop 30583 1726853693.48816: running the handler 30583 1726853693.48906: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853693.48909: _low_level_execute_command(): starting 30583 1726853693.48911: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853693.49598: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853693.49615: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853693.49687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853693.49738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853693.49757: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853693.49787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853693.49867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853693.51983: stdout chunk (state=3): >>>/root <<< 30583 1726853693.51987: stderr chunk (state=3): >>>debug2: Received exit status from master 
0 <<< 30583 1726853693.51990: stdout chunk (state=3): >>><<< 30583 1726853693.51992: stderr chunk (state=3): >>><<< 30583 1726853693.51996: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853693.52000: _low_level_execute_command(): starting 30583 1726853693.52003: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853693.5188756-31885-211933126792749 `" && echo ansible-tmp-1726853693.5188756-31885-211933126792749="` echo /root/.ansible/tmp/ansible-tmp-1726853693.5188756-31885-211933126792749 `" ) && sleep 0' 30583 1726853693.52547: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853693.52570: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 30583 1726853693.52689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853693.52721: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853693.52859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853693.54947: stdout chunk (state=3): >>>ansible-tmp-1726853693.5188756-31885-211933126792749=/root/.ansible/tmp/ansible-tmp-1726853693.5188756-31885-211933126792749 <<< 30583 1726853693.55007: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853693.55022: stdout chunk (state=3): >>><<< 30583 1726853693.55044: stderr chunk (state=3): >>><<< 30583 1726853693.55061: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853693.5188756-31885-211933126792749=/root/.ansible/tmp/ansible-tmp-1726853693.5188756-31885-211933126792749 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853693.55137: variable 'ansible_module_compression' from source: unknown 30583 1726853693.55194: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30583 1726853693.55247: variable 'ansible_facts' from source: unknown 30583 1726853693.55462: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853693.5188756-31885-211933126792749/AnsiballZ_command.py 30583 1726853693.55696: Sending initial data 30583 1726853693.55699: Sent initial data (156 bytes) 30583 1726853693.56705: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853693.56776: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853693.56925: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853693.56929: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853693.56931: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853693.56996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853693.58715: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30583 1726853693.58741: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853693.58818: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853693.58903: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpgk1rfnsc /root/.ansible/tmp/ansible-tmp-1726853693.5188756-31885-211933126792749/AnsiballZ_command.py <<< 30583 1726853693.58906: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853693.5188756-31885-211933126792749/AnsiballZ_command.py" <<< 30583 1726853693.58977: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpgk1rfnsc" to remote "/root/.ansible/tmp/ansible-tmp-1726853693.5188756-31885-211933126792749/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853693.5188756-31885-211933126792749/AnsiballZ_command.py" <<< 30583 1726853693.60643: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853693.60732: stderr chunk (state=3): >>><<< 30583 1726853693.60913: stdout chunk (state=3): >>><<< 30583 1726853693.60916: done transferring module to remote 30583 1726853693.60919: _low_level_execute_command(): starting 30583 1726853693.60921: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853693.5188756-31885-211933126792749/ /root/.ansible/tmp/ansible-tmp-1726853693.5188756-31885-211933126792749/AnsiballZ_command.py && sleep 0' 30583 1726853693.61586: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853693.61595: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853693.61607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853693.61702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853693.63651: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853693.63668: stderr chunk (state=3): >>><<< 30583 1726853693.63772: stdout chunk (state=3): >>><<< 30583 1726853693.63776: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853693.63780: _low_level_execute_command(): starting 30583 1726853693.63783: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853693.5188756-31885-211933126792749/AnsiballZ_command.py && sleep 0' 30583 1726853693.64598: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853693.64613: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853693.64633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853693.64690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853693.64794: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853693.64850: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853693.65118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
30583 1726853693.82449: stdout chunk (state=3): >>> {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 13:34:53.806089", "end": "2024-09-20 13:34:53.823159", "delta": "0:00:00.017070", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30583 1726853693.84265: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853693.84430: stderr chunk (state=3): >>><<< 30583 1726853693.84434: stdout chunk (state=3): >>><<< 30583 1726853693.84437: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 13:34:53.806089", "end": "2024-09-20 13:34:53.823159", "delta": "0:00:00.017070", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853693.84440: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853693.5188756-31885-211933126792749/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853693.84442: _low_level_execute_command(): starting 30583 1726853693.84444: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853693.5188756-31885-211933126792749/ > /dev/null 2>&1 && sleep 0' 30583 1726853693.85446: 
stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853693.85450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853693.85480: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853693.85707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853693.85788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853693.87710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853693.87826: stderr chunk (state=3): >>><<< 30583 1726853693.87830: stdout chunk (state=3): >>><<< 30583 1726853693.87833: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853693.87835: handler run complete 30583 1726853693.87977: Evaluated conditional (False): False 30583 1726853693.87980: attempt loop complete, returning result 30583 1726853693.87982: _execute() done 30583 1726853693.87984: dumping result to json 30583 1726853693.87986: done dumping result, returning 30583 1726853693.87988: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [02083763-bbaf-05ea-abc5-000000000949] 30583 1726853693.87990: sending task result for task 02083763-bbaf-05ea-abc5-000000000949 ok: [managed_node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.017070", "end": "2024-09-20 13:34:53.823159", "rc": 0, "start": "2024-09-20 13:34:53.806089" } STDOUT: statebr /etc/NetworkManager/system-connections/statebr.nmconnection 30583 1726853693.88277: no more pending results, returning what we have 30583 1726853693.88280: results queue empty 30583 1726853693.88282: checking for any_errors_fatal 30583 1726853693.88291: done checking for any_errors_fatal 30583 1726853693.88292: checking for max_fail_percentage 30583 1726853693.88294: done 
checking for max_fail_percentage 30583 1726853693.88295: checking to see if all hosts have failed and the running result is not ok 30583 1726853693.88296: done checking to see if all hosts have failed 30583 1726853693.88296: getting the remaining hosts for this loop 30583 1726853693.88298: done getting the remaining hosts for this loop 30583 1726853693.88302: getting the next task for host managed_node2 30583 1726853693.88311: done getting next task for host managed_node2 30583 1726853693.88314: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30583 1726853693.88319: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853693.88323: getting variables 30583 1726853693.88325: in VariableManager get_vars() 30583 1726853693.88361: Calling all_inventory to load vars for managed_node2 30583 1726853693.88364: Calling groups_inventory to load vars for managed_node2 30583 1726853693.88368: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853693.88484: Calling all_plugins_play to load vars for managed_node2 30583 1726853693.88488: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853693.88491: Calling groups_plugins_play to load vars for managed_node2 30583 1726853693.89086: done sending task result for task 02083763-bbaf-05ea-abc5-000000000949 30583 1726853693.89089: WORKER PROCESS EXITING 30583 1726853693.91683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853693.94216: done with get_vars() 30583 1726853693.94284: done getting variables 30583 1726853693.94415: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:34:53 -0400 (0:00:00.482) 0:00:29.281 ****** 30583 1726853693.94448: entering _queue_task() for managed_node2/set_fact 30583 1726853693.94897: worker is 1 (out of 1 available) 30583 1726853693.94909: exiting _queue_task() for managed_node2/set_fact 30583 1726853693.94926: done queuing things up, now waiting for results queue to drain 30583 1726853693.94927: waiting for pending results... 
30583 1726853693.95204: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30583 1726853693.95314: in run() - task 02083763-bbaf-05ea-abc5-00000000094a 30583 1726853693.95329: variable 'ansible_search_path' from source: unknown 30583 1726853693.95333: variable 'ansible_search_path' from source: unknown 30583 1726853693.95376: calling self._execute() 30583 1726853693.95475: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853693.95520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853693.95523: variable 'omit' from source: magic vars 30583 1726853693.96333: variable 'ansible_distribution_major_version' from source: facts 30583 1726853693.96346: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853693.96680: variable 'nm_profile_exists' from source: set_fact 30583 1726853693.96692: Evaluated conditional (nm_profile_exists.rc == 0): True 30583 1726853693.96700: variable 'omit' from source: magic vars 30583 1726853693.96936: variable 'omit' from source: magic vars 30583 1726853693.96940: variable 'omit' from source: magic vars 30583 1726853693.97306: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853693.97310: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853693.97312: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853693.97314: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853693.97317: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853693.97319: variable 'inventory_hostname' from source: host vars for 'managed_node2' 
30583 1726853693.97321: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853693.97323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853693.97417: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853693.97453: Set connection var ansible_timeout to 10 30583 1726853693.97459: Set connection var ansible_connection to ssh 30583 1726853693.97461: Set connection var ansible_shell_executable to /bin/sh 30583 1726853693.97464: Set connection var ansible_shell_type to sh 30583 1726853693.97481: Set connection var ansible_pipelining to False 30583 1726853693.97522: variable 'ansible_shell_executable' from source: unknown 30583 1726853693.97525: variable 'ansible_connection' from source: unknown 30583 1726853693.97532: variable 'ansible_module_compression' from source: unknown 30583 1726853693.97534: variable 'ansible_shell_type' from source: unknown 30583 1726853693.97537: variable 'ansible_shell_executable' from source: unknown 30583 1726853693.97539: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853693.97545: variable 'ansible_pipelining' from source: unknown 30583 1726853693.97547: variable 'ansible_timeout' from source: unknown 30583 1726853693.97551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853693.97697: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853693.97711: variable 'omit' from source: magic vars 30583 1726853693.97718: starting attempt loop 30583 1726853693.97721: running the handler 30583 1726853693.97734: handler run complete 30583 1726853693.97749: attempt loop complete, returning result 30583 1726853693.97752: _execute() done 
30583 1726853693.97757: dumping result to json 30583 1726853693.97760: done dumping result, returning 30583 1726853693.97876: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-05ea-abc5-00000000094a] 30583 1726853693.97879: sending task result for task 02083763-bbaf-05ea-abc5-00000000094a 30583 1726853693.97943: done sending task result for task 02083763-bbaf-05ea-abc5-00000000094a 30583 1726853693.97946: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 30583 1726853693.98007: no more pending results, returning what we have 30583 1726853693.98011: results queue empty 30583 1726853693.98012: checking for any_errors_fatal 30583 1726853693.98020: done checking for any_errors_fatal 30583 1726853693.98021: checking for max_fail_percentage 30583 1726853693.98024: done checking for max_fail_percentage 30583 1726853693.98025: checking to see if all hosts have failed and the running result is not ok 30583 1726853693.98025: done checking to see if all hosts have failed 30583 1726853693.98026: getting the remaining hosts for this loop 30583 1726853693.98028: done getting the remaining hosts for this loop 30583 1726853693.98032: getting the next task for host managed_node2 30583 1726853693.98045: done getting next task for host managed_node2 30583 1726853693.98047: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 30583 1726853693.98053: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853693.98058: getting variables 30583 1726853693.98060: in VariableManager get_vars() 30583 1726853693.98098: Calling all_inventory to load vars for managed_node2 30583 1726853693.98101: Calling groups_inventory to load vars for managed_node2 30583 1726853693.98105: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853693.98117: Calling all_plugins_play to load vars for managed_node2 30583 1726853693.98121: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853693.98125: Calling groups_plugins_play to load vars for managed_node2 30583 1726853693.99904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853694.02482: done with get_vars() 30583 1726853694.02519: done getting variables 30583 1726853694.02587: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853694.02714: variable 'profile' from source: play vars 30583 
1726853694.02718: variable 'interface' from source: play vars 30583 1726853694.02778: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 13:34:54 -0400 (0:00:00.083) 0:00:29.365 ****** 30583 1726853694.02809: entering _queue_task() for managed_node2/command 30583 1726853694.03156: worker is 1 (out of 1 available) 30583 1726853694.03175: exiting _queue_task() for managed_node2/command 30583 1726853694.03188: done queuing things up, now waiting for results queue to drain 30583 1726853694.03189: waiting for pending results... 30583 1726853694.03489: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr 30583 1726853694.03730: in run() - task 02083763-bbaf-05ea-abc5-00000000094c 30583 1726853694.03735: variable 'ansible_search_path' from source: unknown 30583 1726853694.03737: variable 'ansible_search_path' from source: unknown 30583 1726853694.03741: calling self._execute() 30583 1726853694.03813: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853694.03816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853694.03862: variable 'omit' from source: magic vars 30583 1726853694.04236: variable 'ansible_distribution_major_version' from source: facts 30583 1726853694.04247: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853694.04678: variable 'profile_stat' from source: set_fact 30583 1726853694.04681: Evaluated conditional (profile_stat.stat.exists): False 30583 1726853694.04683: when evaluation is False, skipping this task 30583 1726853694.04685: _execute() done 30583 1726853694.04687: dumping result to json 30583 1726853694.04689: done dumping result, returning 30583 1726853694.04707: done running 
TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr [02083763-bbaf-05ea-abc5-00000000094c] 30583 1726853694.04710: sending task result for task 02083763-bbaf-05ea-abc5-00000000094c 30583 1726853694.04777: done sending task result for task 02083763-bbaf-05ea-abc5-00000000094c 30583 1726853694.04781: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30583 1726853694.04842: no more pending results, returning what we have 30583 1726853694.04845: results queue empty 30583 1726853694.04846: checking for any_errors_fatal 30583 1726853694.04851: done checking for any_errors_fatal 30583 1726853694.04851: checking for max_fail_percentage 30583 1726853694.04853: done checking for max_fail_percentage 30583 1726853694.04854: checking to see if all hosts have failed and the running result is not ok 30583 1726853694.04855: done checking to see if all hosts have failed 30583 1726853694.04855: getting the remaining hosts for this loop 30583 1726853694.04857: done getting the remaining hosts for this loop 30583 1726853694.04860: getting the next task for host managed_node2 30583 1726853694.04868: done getting next task for host managed_node2 30583 1726853694.04873: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 30583 1726853694.04878: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853694.04881: getting variables 30583 1726853694.04883: in VariableManager get_vars() 30583 1726853694.04910: Calling all_inventory to load vars for managed_node2 30583 1726853694.04913: Calling groups_inventory to load vars for managed_node2 30583 1726853694.04921: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853694.04931: Calling all_plugins_play to load vars for managed_node2 30583 1726853694.04934: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853694.04937: Calling groups_plugins_play to load vars for managed_node2 30583 1726853694.07437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853694.09305: done with get_vars() 30583 1726853694.09338: done getting variables 30583 1726853694.09515: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853694.09735: variable 'profile' from source: play vars 30583 1726853694.09739: variable 'interface' from source: play vars 30583 1726853694.09799: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] 
********************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:34:54 -0400 (0:00:00.070) 0:00:29.435 ****** 30583 1726853694.09835: entering _queue_task() for managed_node2/set_fact 30583 1726853694.10681: worker is 1 (out of 1 available) 30583 1726853694.10695: exiting _queue_task() for managed_node2/set_fact 30583 1726853694.10707: done queuing things up, now waiting for results queue to drain 30583 1726853694.10708: waiting for pending results... 30583 1726853694.10928: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr 30583 1726853694.11062: in run() - task 02083763-bbaf-05ea-abc5-00000000094d 30583 1726853694.11085: variable 'ansible_search_path' from source: unknown 30583 1726853694.11092: variable 'ansible_search_path' from source: unknown 30583 1726853694.11132: calling self._execute() 30583 1726853694.11228: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853694.11241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853694.11254: variable 'omit' from source: magic vars 30583 1726853694.11624: variable 'ansible_distribution_major_version' from source: facts 30583 1726853694.11635: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853694.11761: variable 'profile_stat' from source: set_fact 30583 1726853694.11776: Evaluated conditional (profile_stat.stat.exists): False 30583 1726853694.11779: when evaluation is False, skipping this task 30583 1726853694.11782: _execute() done 30583 1726853694.11784: dumping result to json 30583 1726853694.11786: done dumping result, returning 30583 1726853694.11807: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [02083763-bbaf-05ea-abc5-00000000094d] 30583 1726853694.11810: sending task result for task 
02083763-bbaf-05ea-abc5-00000000094d 30583 1726853694.12034: done sending task result for task 02083763-bbaf-05ea-abc5-00000000094d 30583 1726853694.12037: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30583 1726853694.12086: no more pending results, returning what we have 30583 1726853694.12090: results queue empty 30583 1726853694.12091: checking for any_errors_fatal 30583 1726853694.12098: done checking for any_errors_fatal 30583 1726853694.12099: checking for max_fail_percentage 30583 1726853694.12101: done checking for max_fail_percentage 30583 1726853694.12102: checking to see if all hosts have failed and the running result is not ok 30583 1726853694.12103: done checking to see if all hosts have failed 30583 1726853694.12104: getting the remaining hosts for this loop 30583 1726853694.12105: done getting the remaining hosts for this loop 30583 1726853694.12108: getting the next task for host managed_node2 30583 1726853694.12116: done getting next task for host managed_node2 30583 1726853694.12118: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 30583 1726853694.12123: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853694.12127: getting variables 30583 1726853694.12128: in VariableManager get_vars() 30583 1726853694.12247: Calling all_inventory to load vars for managed_node2 30583 1726853694.12250: Calling groups_inventory to load vars for managed_node2 30583 1726853694.12253: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853694.12267: Calling all_plugins_play to load vars for managed_node2 30583 1726853694.12272: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853694.12276: Calling groups_plugins_play to load vars for managed_node2 30583 1726853694.13649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853694.15219: done with get_vars() 30583 1726853694.15242: done getting variables 30583 1726853694.15301: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853694.15407: variable 'profile' from source: play vars 30583 1726853694.15411: variable 'interface' from source: play vars 30583 1726853694.15464: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:34:54 -0400 (0:00:00.056) 
0:00:29.492 ****** 30583 1726853694.15497: entering _queue_task() for managed_node2/command 30583 1726853694.15848: worker is 1 (out of 1 available) 30583 1726853694.15862: exiting _queue_task() for managed_node2/command 30583 1726853694.15981: done queuing things up, now waiting for results queue to drain 30583 1726853694.15983: waiting for pending results... 30583 1726853694.16179: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr 30583 1726853694.16294: in run() - task 02083763-bbaf-05ea-abc5-00000000094e 30583 1726853694.16309: variable 'ansible_search_path' from source: unknown 30583 1726853694.16314: variable 'ansible_search_path' from source: unknown 30583 1726853694.16362: calling self._execute() 30583 1726853694.16452: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853694.16460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853694.16511: variable 'omit' from source: magic vars 30583 1726853694.17075: variable 'ansible_distribution_major_version' from source: facts 30583 1726853694.17078: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853694.17080: variable 'profile_stat' from source: set_fact 30583 1726853694.17082: Evaluated conditional (profile_stat.stat.exists): False 30583 1726853694.17084: when evaluation is False, skipping this task 30583 1726853694.17085: _execute() done 30583 1726853694.17087: dumping result to json 30583 1726853694.17088: done dumping result, returning 30583 1726853694.17090: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr [02083763-bbaf-05ea-abc5-00000000094e] 30583 1726853694.17092: sending task result for task 02083763-bbaf-05ea-abc5-00000000094e 30583 1726853694.17147: done sending task result for task 02083763-bbaf-05ea-abc5-00000000094e 30583 1726853694.17150: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, 
"false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30583 1726853694.17222: no more pending results, returning what we have 30583 1726853694.17227: results queue empty 30583 1726853694.17228: checking for any_errors_fatal 30583 1726853694.17234: done checking for any_errors_fatal 30583 1726853694.17234: checking for max_fail_percentage 30583 1726853694.17236: done checking for max_fail_percentage 30583 1726853694.17238: checking to see if all hosts have failed and the running result is not ok 30583 1726853694.17238: done checking to see if all hosts have failed 30583 1726853694.17239: getting the remaining hosts for this loop 30583 1726853694.17241: done getting the remaining hosts for this loop 30583 1726853694.17245: getting the next task for host managed_node2 30583 1726853694.17253: done getting next task for host managed_node2 30583 1726853694.17255: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 30583 1726853694.17260: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853694.17265: getting variables 30583 1726853694.17267: in VariableManager get_vars() 30583 1726853694.17302: Calling all_inventory to load vars for managed_node2 30583 1726853694.17305: Calling groups_inventory to load vars for managed_node2 30583 1726853694.17309: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853694.17322: Calling all_plugins_play to load vars for managed_node2 30583 1726853694.17325: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853694.17328: Calling groups_plugins_play to load vars for managed_node2 30583 1726853694.23693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853694.25264: done with get_vars() 30583 1726853694.25293: done getting variables 30583 1726853694.25348: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853694.25448: variable 'profile' from source: play vars 30583 1726853694.25452: variable 'interface' from source: play vars 30583 1726853694.25519: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:34:54 -0400 (0:00:00.100) 0:00:29.592 ****** 30583 1726853694.25549: entering _queue_task() for managed_node2/set_fact 30583 1726853694.25911: worker is 1 (out of 1 available) 30583 1726853694.25924: exiting _queue_task() for managed_node2/set_fact 30583 
1726853694.25936: done queuing things up, now waiting for results queue to drain 30583 1726853694.25937: waiting for pending results... 30583 1726853694.26197: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr 30583 1726853694.26345: in run() - task 02083763-bbaf-05ea-abc5-00000000094f 30583 1726853694.26372: variable 'ansible_search_path' from source: unknown 30583 1726853694.26382: variable 'ansible_search_path' from source: unknown 30583 1726853694.26428: calling self._execute() 30583 1726853694.26542: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853694.26554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853694.26618: variable 'omit' from source: magic vars 30583 1726853694.26931: variable 'ansible_distribution_major_version' from source: facts 30583 1726853694.26951: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853694.27076: variable 'profile_stat' from source: set_fact 30583 1726853694.27092: Evaluated conditional (profile_stat.stat.exists): False 30583 1726853694.27102: when evaluation is False, skipping this task 30583 1726853694.27109: _execute() done 30583 1726853694.27115: dumping result to json 30583 1726853694.27159: done dumping result, returning 30583 1726853694.27163: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr [02083763-bbaf-05ea-abc5-00000000094f] 30583 1726853694.27165: sending task result for task 02083763-bbaf-05ea-abc5-00000000094f skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30583 1726853694.27401: no more pending results, returning what we have 30583 1726853694.27409: results queue empty 30583 1726853694.27411: checking for any_errors_fatal 30583 1726853694.27421: done checking for any_errors_fatal 30583 1726853694.27422: checking for 
max_fail_percentage 30583 1726853694.27424: done checking for max_fail_percentage 30583 1726853694.27425: checking to see if all hosts have failed and the running result is not ok 30583 1726853694.27425: done checking to see if all hosts have failed 30583 1726853694.27426: getting the remaining hosts for this loop 30583 1726853694.27428: done getting the remaining hosts for this loop 30583 1726853694.27433: getting the next task for host managed_node2 30583 1726853694.27442: done getting next task for host managed_node2 30583 1726853694.27446: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 30583 1726853694.27450: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853694.27454: getting variables 30583 1726853694.27455: in VariableManager get_vars() 30583 1726853694.27488: Calling all_inventory to load vars for managed_node2 30583 1726853694.27490: Calling groups_inventory to load vars for managed_node2 30583 1726853694.27494: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853694.27504: Calling all_plugins_play to load vars for managed_node2 30583 1726853694.27507: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853694.27510: Calling groups_plugins_play to load vars for managed_node2 30583 1726853694.28028: done sending task result for task 02083763-bbaf-05ea-abc5-00000000094f 30583 1726853694.28032: WORKER PROCESS EXITING 30583 1726853694.29721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853694.32397: done with get_vars() 30583 1726853694.32421: done getting variables 30583 1726853694.32484: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853694.32905: variable 'profile' from source: play vars 30583 1726853694.32909: variable 'interface' from source: play vars 30583 1726853694.32968: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'statebr'] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 13:34:54 -0400 (0:00:00.074) 0:00:29.667 ****** 30583 1726853694.33000: entering _queue_task() for managed_node2/assert 30583 1726853694.33764: worker is 1 (out of 1 available) 30583 1726853694.33781: exiting _queue_task() for managed_node2/assert 30583 
1726853694.33795: done queuing things up, now waiting for results queue to drain 30583 1726853694.33796: waiting for pending results... 30583 1726853694.34489: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'statebr' 30583 1726853694.34877: in run() - task 02083763-bbaf-05ea-abc5-0000000008ae 30583 1726853694.34881: variable 'ansible_search_path' from source: unknown 30583 1726853694.34884: variable 'ansible_search_path' from source: unknown 30583 1726853694.34887: calling self._execute() 30583 1726853694.34890: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853694.34892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853694.34895: variable 'omit' from source: magic vars 30583 1726853694.35628: variable 'ansible_distribution_major_version' from source: facts 30583 1726853694.36076: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853694.36080: variable 'omit' from source: magic vars 30583 1726853694.36082: variable 'omit' from source: magic vars 30583 1726853694.36084: variable 'profile' from source: play vars 30583 1726853694.36086: variable 'interface' from source: play vars 30583 1726853694.36133: variable 'interface' from source: play vars 30583 1726853694.36576: variable 'omit' from source: magic vars 30583 1726853694.36580: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853694.36583: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853694.36586: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853694.36588: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853694.36591: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853694.36594: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853694.36596: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853694.36599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853694.36683: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853694.36887: Set connection var ansible_timeout to 10 30583 1726853694.36895: Set connection var ansible_connection to ssh 30583 1726853694.36905: Set connection var ansible_shell_executable to /bin/sh 30583 1726853694.36911: Set connection var ansible_shell_type to sh 30583 1726853694.36925: Set connection var ansible_pipelining to False 30583 1726853694.36957: variable 'ansible_shell_executable' from source: unknown 30583 1726853694.36966: variable 'ansible_connection' from source: unknown 30583 1726853694.36976: variable 'ansible_module_compression' from source: unknown 30583 1726853694.36984: variable 'ansible_shell_type' from source: unknown 30583 1726853694.36990: variable 'ansible_shell_executable' from source: unknown 30583 1726853694.36996: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853694.37004: variable 'ansible_pipelining' from source: unknown 30583 1726853694.37010: variable 'ansible_timeout' from source: unknown 30583 1726853694.37017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853694.37170: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853694.37391: variable 'omit' from source: magic vars 30583 1726853694.37402: starting 
attempt loop 30583 1726853694.37409: running the handler 30583 1726853694.37737: variable 'lsr_net_profile_exists' from source: set_fact 30583 1726853694.37748: Evaluated conditional (lsr_net_profile_exists): True 30583 1726853694.37761: handler run complete 30583 1726853694.37783: attempt loop complete, returning result 30583 1726853694.37791: _execute() done 30583 1726853694.37798: dumping result to json 30583 1726853694.37807: done dumping result, returning 30583 1726853694.37821: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'statebr' [02083763-bbaf-05ea-abc5-0000000008ae] 30583 1726853694.37831: sending task result for task 02083763-bbaf-05ea-abc5-0000000008ae ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30583 1726853694.37985: no more pending results, returning what we have 30583 1726853694.37989: results queue empty 30583 1726853694.37990: checking for any_errors_fatal 30583 1726853694.37995: done checking for any_errors_fatal 30583 1726853694.37996: checking for max_fail_percentage 30583 1726853694.37998: done checking for max_fail_percentage 30583 1726853694.37999: checking to see if all hosts have failed and the running result is not ok 30583 1726853694.38000: done checking to see if all hosts have failed 30583 1726853694.38000: getting the remaining hosts for this loop 30583 1726853694.38002: done getting the remaining hosts for this loop 30583 1726853694.38006: getting the next task for host managed_node2 30583 1726853694.38014: done getting next task for host managed_node2 30583 1726853694.38017: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 30583 1726853694.38020: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853694.38177: getting variables 30583 1726853694.38180: in VariableManager get_vars() 30583 1726853694.38217: Calling all_inventory to load vars for managed_node2 30583 1726853694.38220: Calling groups_inventory to load vars for managed_node2 30583 1726853694.38224: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853694.38238: Calling all_plugins_play to load vars for managed_node2 30583 1726853694.38241: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853694.38244: Calling groups_plugins_play to load vars for managed_node2 30583 1726853694.39603: done sending task result for task 02083763-bbaf-05ea-abc5-0000000008ae 30583 1726853694.39607: WORKER PROCESS EXITING 30583 1726853694.41536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853694.44708: done with get_vars() 30583 1726853694.44739: done getting variables 30583 1726853694.44908: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853694.45044: variable 'profile' 
from source: play vars 30583 1726853694.45048: variable 'interface' from source: play vars 30583 1726853694.45117: variable 'interface' from source: play vars TASK [Assert that the ansible managed comment is present in 'statebr'] ********* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 13:34:54 -0400 (0:00:00.121) 0:00:29.788 ****** 30583 1726853694.45153: entering _queue_task() for managed_node2/assert 30583 1726853694.45612: worker is 1 (out of 1 available) 30583 1726853694.45623: exiting _queue_task() for managed_node2/assert 30583 1726853694.45635: done queuing things up, now waiting for results queue to drain 30583 1726853694.45636: waiting for pending results... 30583 1726853694.46154: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'statebr' 30583 1726853694.46463: in run() - task 02083763-bbaf-05ea-abc5-0000000008af 30583 1726853694.46488: variable 'ansible_search_path' from source: unknown 30583 1726853694.46496: variable 'ansible_search_path' from source: unknown 30583 1726853694.46549: calling self._execute() 30583 1726853694.46681: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853694.46693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853694.46708: variable 'omit' from source: magic vars 30583 1726853694.47114: variable 'ansible_distribution_major_version' from source: facts 30583 1726853694.47132: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853694.47145: variable 'omit' from source: magic vars 30583 1726853694.47211: variable 'omit' from source: magic vars 30583 1726853694.47323: variable 'profile' from source: play vars 30583 1726853694.47334: variable 'interface' from source: play vars 30583 1726853694.47413: variable 'interface' from source: play vars 30583 1726853694.47439: 
variable 'omit' from source: magic vars 30583 1726853694.47488: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853694.47535: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853694.47625: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853694.47628: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853694.47631: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853694.47644: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853694.47651: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853694.47663: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853694.47781: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853694.47794: Set connection var ansible_timeout to 10 30583 1726853694.47801: Set connection var ansible_connection to ssh 30583 1726853694.47811: Set connection var ansible_shell_executable to /bin/sh 30583 1726853694.47817: Set connection var ansible_shell_type to sh 30583 1726853694.47832: Set connection var ansible_pipelining to False 30583 1726853694.47879: variable 'ansible_shell_executable' from source: unknown 30583 1726853694.47882: variable 'ansible_connection' from source: unknown 30583 1726853694.47888: variable 'ansible_module_compression' from source: unknown 30583 1726853694.47896: variable 'ansible_shell_type' from source: unknown 30583 1726853694.47902: variable 'ansible_shell_executable' from source: unknown 30583 1726853694.47909: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853694.47917: variable 'ansible_pipelining' from 
source: unknown 30583 1726853694.47924: variable 'ansible_timeout' from source: unknown 30583 1726853694.47931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853694.48101: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853694.48169: variable 'omit' from source: magic vars 30583 1726853694.48175: starting attempt loop 30583 1726853694.48177: running the handler 30583 1726853694.48258: variable 'lsr_net_profile_ansible_managed' from source: set_fact 30583 1726853694.48270: Evaluated conditional (lsr_net_profile_ansible_managed): True 30583 1726853694.48287: handler run complete 30583 1726853694.48306: attempt loop complete, returning result 30583 1726853694.48319: _execute() done 30583 1726853694.48387: dumping result to json 30583 1726853694.48391: done dumping result, returning 30583 1726853694.48393: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'statebr' [02083763-bbaf-05ea-abc5-0000000008af] 30583 1726853694.48395: sending task result for task 02083763-bbaf-05ea-abc5-0000000008af 30583 1726853694.48576: done sending task result for task 02083763-bbaf-05ea-abc5-0000000008af 30583 1726853694.48580: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30583 1726853694.48626: no more pending results, returning what we have 30583 1726853694.48629: results queue empty 30583 1726853694.48630: checking for any_errors_fatal 30583 1726853694.48639: done checking for any_errors_fatal 30583 1726853694.48640: checking for max_fail_percentage 30583 1726853694.48642: done checking for max_fail_percentage 30583 1726853694.48643: checking to see if all hosts have failed and the 
running result is not ok 30583 1726853694.48643: done checking to see if all hosts have failed 30583 1726853694.48644: getting the remaining hosts for this loop 30583 1726853694.48646: done getting the remaining hosts for this loop 30583 1726853694.48649: getting the next task for host managed_node2 30583 1726853694.48659: done getting next task for host managed_node2 30583 1726853694.48662: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 30583 1726853694.48666: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853694.48672: getting variables 30583 1726853694.48674: in VariableManager get_vars() 30583 1726853694.48710: Calling all_inventory to load vars for managed_node2 30583 1726853694.48713: Calling groups_inventory to load vars for managed_node2 30583 1726853694.48717: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853694.48729: Calling all_plugins_play to load vars for managed_node2 30583 1726853694.48733: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853694.48736: Calling groups_plugins_play to load vars for managed_node2 30583 1726853694.52302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853694.55520: done with get_vars() 30583 1726853694.55548: done getting variables 30583 1726853694.55815: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853694.55929: variable 'profile' from source: play vars 30583 1726853694.56177: variable 'interface' from source: play vars 30583 1726853694.56248: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in statebr] *************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 13:34:54 -0400 (0:00:00.111) 0:00:29.900 ****** 30583 1726853694.56288: entering _queue_task() for managed_node2/assert 30583 1726853694.57038: worker is 1 (out of 1 available) 30583 1726853694.57052: exiting _queue_task() for managed_node2/assert 30583 1726853694.57066: done queuing things up, now waiting for results queue to drain 30583 1726853694.57067: waiting for pending results... 
30583 1726853694.57651: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in statebr
30583 1726853694.57805: in run() - task 02083763-bbaf-05ea-abc5-0000000008b0
30583 1726853694.57809: variable 'ansible_search_path' from source: unknown
30583 1726853694.57812: variable 'ansible_search_path' from source: unknown
30583 1726853694.57913: calling self._execute()
30583 1726853694.58006: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853694.58009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853694.58020: variable 'omit' from source: magic vars
30583 1726853694.58389: variable 'ansible_distribution_major_version' from source: facts
30583 1726853694.58400: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853694.58417: variable 'omit' from source: magic vars
30583 1726853694.58462: variable 'omit' from source: magic vars
30583 1726853694.58557: variable 'profile' from source: play vars
30583 1726853694.58563: variable 'interface' from source: play vars
30583 1726853694.58625: variable 'interface' from source: play vars
30583 1726853694.58645: variable 'omit' from source: magic vars
30583 1726853694.58688: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30583 1726853694.58721: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30583 1726853694.58741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30583 1726853694.58757: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30583 1726853694.58776: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30583 1726853694.58805: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30583 1726853694.58808: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853694.58811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853694.58909: Set connection var ansible_module_compression to ZIP_DEFLATED
30583 1726853694.58915: Set connection var ansible_timeout to 10
30583 1726853694.58918: Set connection var ansible_connection to ssh
30583 1726853694.58923: Set connection var ansible_shell_executable to /bin/sh
30583 1726853694.58925: Set connection var ansible_shell_type to sh
30583 1726853694.58935: Set connection var ansible_pipelining to False
30583 1726853694.58961: variable 'ansible_shell_executable' from source: unknown
30583 1726853694.58964: variable 'ansible_connection' from source: unknown
30583 1726853694.58967: variable 'ansible_module_compression' from source: unknown
30583 1726853694.58969: variable 'ansible_shell_type' from source: unknown
30583 1726853694.58973: variable 'ansible_shell_executable' from source: unknown
30583 1726853694.58975: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853694.59002: variable 'ansible_pipelining' from source: unknown
30583 1726853694.59005: variable 'ansible_timeout' from source: unknown
30583 1726853694.59008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853694.59119: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30583 1726853694.59175: variable 'omit' from source: magic vars
30583 1726853694.59179: starting attempt loop
30583 1726853694.59181: running the handler
30583 1726853694.59242: variable 'lsr_net_profile_fingerprint' from source: set_fact
30583 1726853694.59246: Evaluated conditional (lsr_net_profile_fingerprint): True
30583 1726853694.59253: handler run complete
30583 1726853694.59270: attempt loop complete, returning result
30583 1726853694.59275: _execute() done
30583 1726853694.59278: dumping result to json
30583 1726853694.59281: done dumping result, returning
30583 1726853694.59287: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in statebr [02083763-bbaf-05ea-abc5-0000000008b0]
30583 1726853694.59329: sending task result for task 02083763-bbaf-05ea-abc5-0000000008b0
30583 1726853694.59419: done sending task result for task 02083763-bbaf-05ea-abc5-0000000008b0
30583 1726853694.59422: WORKER PROCESS EXITING

ok: [managed_node2] => {
    "changed": false
}

MSG:

All assertions passed

30583 1726853694.59478: no more pending results, returning what we have
30583 1726853694.59482: results queue empty
30583 1726853694.59483: checking for any_errors_fatal
30583 1726853694.59491: done checking for any_errors_fatal
30583 1726853694.59491: checking for max_fail_percentage
30583 1726853694.59493: done checking for max_fail_percentage
30583 1726853694.59494: checking to see if all hosts have failed and the running result is not ok
30583 1726853694.59495: done checking to see if all hosts have failed
30583 1726853694.59496: getting the remaining hosts for this loop
30583 1726853694.59498: done getting the remaining hosts for this loop
30583 1726853694.59501: getting the next task for host managed_node2
30583 1726853694.59509: done getting next task for host managed_node2
30583 1726853694.59519: ^ task is: TASK: Conditional asserts
30583 1726853694.59523: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853694.59527: getting variables
30583 1726853694.59529: in VariableManager get_vars()
30583 1726853694.59558: Calling all_inventory to load vars for managed_node2
30583 1726853694.59560: Calling groups_inventory to load vars for managed_node2
30583 1726853694.59564: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853694.59575: Calling all_plugins_play to load vars for managed_node2
30583 1726853694.59578: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853694.59580: Calling groups_plugins_play to load vars for managed_node2
30583 1726853694.61507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853694.63215: done with get_vars()
30583 1726853694.63245: done getting variables

TASK [Conditional asserts] *****************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42
Friday 20 September 2024  13:34:54 -0400 (0:00:00.070)       0:00:29.970 ******

30583 1726853694.63346: entering _queue_task() for managed_node2/include_tasks
30583 1726853694.63703: worker is 1 (out of 1 available)
30583 1726853694.63716: exiting _queue_task() for managed_node2/include_tasks
30583 1726853694.63729: done queuing things up, now waiting for results queue to drain
30583 1726853694.63731: waiting for pending results...
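The records above trace the `assert` action plugin: it resolves `lsr_net_profile_fingerprint` (set earlier via `set_fact`), evaluates the conditional `(lsr_net_profile_fingerprint): True`, and returns `changed: false` with "All assertions passed". A minimal sketch of a task that would produce this trace, reconstructed from the task name, variables, and conditional visible in the log (the exact field layout in `assert_profile_present.yml:15` is an assumption, not the collection's actual source):

```yaml
# Hypothetical reconstruction; only the task name, the assert action, and the
# lsr_net_profile_fingerprint condition are confirmed by the log records above.
- name: "Assert that the fingerprint comment is present in {{ profile }}"
  assert:
    that:
      - lsr_net_profile_fingerprint
```

Because `assert` runs entirely on the controller, no `_low_level_execute_command()` records appear for this task, unlike the shell task later in the log.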
30583 1726853694.64118: running TaskExecutor() for managed_node2/TASK: Conditional asserts
30583 1726853694.64194: in run() - task 02083763-bbaf-05ea-abc5-0000000005ba
30583 1726853694.64198: variable 'ansible_search_path' from source: unknown
30583 1726853694.64201: variable 'ansible_search_path' from source: unknown
30583 1726853694.64420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30583 1726853694.66292: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30583 1726853694.66334: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30583 1726853694.66369: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30583 1726853694.66439: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30583 1726853694.66442: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30583 1726853694.66509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853694.66544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853694.66653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853694.66659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853694.66662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853694.66751: dumping result to json
30583 1726853694.66754: done dumping result, returning
30583 1726853694.66760: done running TaskExecutor() for managed_node2/TASK: Conditional asserts [02083763-bbaf-05ea-abc5-0000000005ba]
30583 1726853694.66766: sending task result for task 02083763-bbaf-05ea-abc5-0000000005ba
30583 1726853694.66869: done sending task result for task 02083763-bbaf-05ea-abc5-0000000005ba
30583 1726853694.66874: WORKER PROCESS EXITING

skipping: [managed_node2] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}

30583 1726853694.66924: no more pending results, returning what we have
30583 1726853694.66927: results queue empty
30583 1726853694.66928: checking for any_errors_fatal
30583 1726853694.66934: done checking for any_errors_fatal
30583 1726853694.66935: checking for max_fail_percentage
30583 1726853694.66937: done checking for max_fail_percentage
30583 1726853694.66937: checking to see if all hosts have failed and the running result is not ok
30583 1726853694.66938: done checking to see if all hosts have failed
30583 1726853694.66939: getting the remaining hosts for this loop
30583 1726853694.66941: done getting the remaining hosts for this loop
30583 1726853694.66944: getting the next task for host managed_node2
30583 1726853694.66951: done getting next task for host managed_node2
30583 1726853694.66953: ^ task is: TASK: Success in test '{{ lsr_description }}'
30583 1726853694.66959: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853694.66962: getting variables
30583 1726853694.66964: in VariableManager get_vars()
30583 1726853694.66998: Calling all_inventory to load vars for managed_node2
30583 1726853694.67000: Calling groups_inventory to load vars for managed_node2
30583 1726853694.67004: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853694.67013: Calling all_plugins_play to load vars for managed_node2
30583 1726853694.67016: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853694.67019: Calling groups_plugins_play to load vars for managed_node2
30583 1726853694.68393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853694.70031: done with get_vars()
30583 1726853694.70056: done getting variables
30583 1726853694.70126: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30583 1726853694.70269: variable 'lsr_description' from source: include params

TASK [Success in test 'I can create a profile without autoconnect'] ************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47
Friday 20 September 2024  13:34:54 -0400 (0:00:00.069)       0:00:30.040 ******

30583 1726853694.70302: entering _queue_task() for managed_node2/debug
30583 1726853694.70686: worker is 1 (out of 1 available)
30583 1726853694.70794: exiting _queue_task() for managed_node2/debug
30583 1726853694.70811: done queuing things up, now waiting for results queue to drain
30583 1726853694.70813: waiting for pending results...
30583 1726853694.71146: running TaskExecutor() for managed_node2/TASK: Success in test 'I can create a profile without autoconnect'
30583 1726853694.71151: in run() - task 02083763-bbaf-05ea-abc5-0000000005bb
30583 1726853694.71181: variable 'ansible_search_path' from source: unknown
30583 1726853694.71195: variable 'ansible_search_path' from source: unknown
30583 1726853694.71243: calling self._execute()
30583 1726853694.71357: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853694.71375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853694.71466: variable 'omit' from source: magic vars
30583 1726853694.71845: variable 'ansible_distribution_major_version' from source: facts
30583 1726853694.71868: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853694.71885: variable 'omit' from source: magic vars
30583 1726853694.71940: variable 'omit' from source: magic vars
30583 1726853694.72058: variable 'lsr_description' from source: include params
30583 1726853694.72088: variable 'omit' from source: magic vars
30583 1726853694.72141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30583 1726853694.72228: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30583 1726853694.72231: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30583 1726853694.72244: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30583 1726853694.72264: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30583 1726853694.72302: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30583 1726853694.72311: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853694.72318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853694.72433: Set connection var ansible_module_compression to ZIP_DEFLATED
30583 1726853694.72452: Set connection var ansible_timeout to 10
30583 1726853694.72464: Set connection var ansible_connection to ssh
30583 1726853694.72554: Set connection var ansible_shell_executable to /bin/sh
30583 1726853694.72560: Set connection var ansible_shell_type to sh
30583 1726853694.72562: Set connection var ansible_pipelining to False
30583 1726853694.72565: variable 'ansible_shell_executable' from source: unknown
30583 1726853694.72567: variable 'ansible_connection' from source: unknown
30583 1726853694.72569: variable 'ansible_module_compression' from source: unknown
30583 1726853694.72573: variable 'ansible_shell_type' from source: unknown
30583 1726853694.72575: variable 'ansible_shell_executable' from source: unknown
30583 1726853694.72578: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853694.72580: variable 'ansible_pipelining' from source: unknown
30583 1726853694.72586: variable 'ansible_timeout' from source: unknown
30583 1726853694.72588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853694.72763: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30583 1726853694.72797: variable 'omit' from source: magic vars
30583 1726853694.72817: starting attempt loop
30583 1726853694.72825: running the handler
30583 1726853694.72976: handler run complete
30583 1726853694.72981: attempt loop complete, returning result
30583 1726853694.73013: _execute() done
30583 1726853694.73015: dumping result to json
30583 1726853694.73017: done dumping result, returning
30583 1726853694.73019: done running TaskExecutor() for managed_node2/TASK: Success in test 'I can create a profile without autoconnect' [02083763-bbaf-05ea-abc5-0000000005bb]
30583 1726853694.73020: sending task result for task 02083763-bbaf-05ea-abc5-0000000005bb
30583 1726853694.73086: done sending task result for task 02083763-bbaf-05ea-abc5-0000000005bb
30583 1726853694.73089: WORKER PROCESS EXITING

ok: [managed_node2] => {}

MSG:

+++++ Success in test 'I can create a profile without autoconnect' +++++

30583 1726853694.73138: no more pending results, returning what we have
30583 1726853694.73147: results queue empty
30583 1726853694.73148: checking for any_errors_fatal
30583 1726853694.73157: done checking for any_errors_fatal
30583 1726853694.73158: checking for max_fail_percentage
30583 1726853694.73160: done checking for max_fail_percentage
30583 1726853694.73161: checking to see if all hosts have failed and the running result is not ok
30583 1726853694.73162: done checking to see if all hosts have failed
30583 1726853694.73163: getting the remaining hosts for this loop
30583 1726853694.73165: done getting the remaining hosts for this loop
30583 1726853694.73169: getting the next task for host managed_node2
30583 1726853694.73181: done getting next task for host managed_node2
30583 1726853694.73188: ^ task is: TASK: Cleanup
30583 1726853694.73191: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853694.73198: getting variables
30583 1726853694.73200: in VariableManager get_vars()
30583 1726853694.73234: Calling all_inventory to load vars for managed_node2
30583 1726853694.73237: Calling groups_inventory to load vars for managed_node2
30583 1726853694.73241: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853694.73252: Calling all_plugins_play to load vars for managed_node2
30583 1726853694.73259: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853694.73262: Calling groups_plugins_play to load vars for managed_node2
30583 1726853694.75517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853694.77425: done with get_vars()
30583 1726853694.77453: done getting variables

TASK [Cleanup] *****************************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66
Friday 20 September 2024  13:34:54 -0400 (0:00:00.072)       0:00:30.112 ******

30583 1726853694.77555: entering _queue_task() for managed_node2/include_tasks
30583 1726853694.77998: worker is 1 (out of 1 available)
30583 1726853694.78011: exiting _queue_task() for managed_node2/include_tasks
30583 1726853694.78028: done queuing things up, now waiting for results queue to drain
30583 1726853694.78030: waiting for pending results...
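The `TASK [Cleanup]` just queued is an `include_tasks`: the log shows `lsr_cleanup` coming from include params, a per-`item` evaluation of the distribution conditional, and (further on) the single included item `tasks/cleanup_profile+device.yml`. A hedged sketch of that pattern, reconstructed from those records (the exact syntax at `run_test.yml:66` is an assumption):

```yaml
# Hypothetical sketch; the log confirms only the include_tasks action, the
# lsr_cleanup loop variable, and the item tasks/cleanup_profile+device.yml.
- name: Cleanup
  include_tasks: "{{ item }}"
  loop: "{{ lsr_cleanup }}"  # e.g. ['tasks/cleanup_profile+device.yml']
```

Because `include_tasks` is resolved on the controller, the worker returns immediately and the strategy then splices the included file's blocks into the host's task list ("extending task lists for all hosts with included blocks").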
30583 1726853694.78790: running TaskExecutor() for managed_node2/TASK: Cleanup
30583 1726853694.78797: in run() - task 02083763-bbaf-05ea-abc5-0000000005bf
30583 1726853694.79178: variable 'ansible_search_path' from source: unknown
30583 1726853694.79182: variable 'ansible_search_path' from source: unknown
30583 1726853694.79185: variable 'lsr_cleanup' from source: include params
30583 1726853694.79576: variable 'lsr_cleanup' from source: include params
30583 1726853694.79580: variable 'omit' from source: magic vars
30583 1726853694.79807: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853694.79822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853694.79838: variable 'omit' from source: magic vars
30583 1726853694.80099: variable 'ansible_distribution_major_version' from source: facts
30583 1726853694.80115: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853694.80126: variable 'item' from source: unknown
30583 1726853694.80196: variable 'item' from source: unknown
30583 1726853694.80235: variable 'item' from source: unknown
30583 1726853694.80301: variable 'item' from source: unknown
30583 1726853694.80449: dumping result to json
30583 1726853694.80462: done dumping result, returning
30583 1726853694.80477: done running TaskExecutor() for managed_node2/TASK: Cleanup [02083763-bbaf-05ea-abc5-0000000005bf]
30583 1726853694.80486: sending task result for task 02083763-bbaf-05ea-abc5-0000000005bf
30583 1726853694.80573: no more pending results, returning what we have
30583 1726853694.80580: in VariableManager get_vars()
30583 1726853694.80619: Calling all_inventory to load vars for managed_node2
30583 1726853694.80622: Calling groups_inventory to load vars for managed_node2
30583 1726853694.80626: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853694.80644: Calling all_plugins_play to load vars for managed_node2
30583 1726853694.80651: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853694.80654: Calling groups_plugins_play to load vars for managed_node2
30583 1726853694.81316: done sending task result for task 02083763-bbaf-05ea-abc5-0000000005bf
30583 1726853694.81320: WORKER PROCESS EXITING
30583 1726853694.82227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853694.83818: done with get_vars()
30583 1726853694.83838: variable 'ansible_search_path' from source: unknown
30583 1726853694.83839: variable 'ansible_search_path' from source: unknown
30583 1726853694.83880: we have included files to process
30583 1726853694.83881: generating all_blocks data
30583 1726853694.83884: done generating all_blocks data
30583 1726853694.83888: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml
30583 1726853694.83889: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml
30583 1726853694.83891: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml
30583 1726853694.84105: done processing included file
30583 1726853694.84107: iterating over new_blocks loaded from include file
30583 1726853694.84109: in VariableManager get_vars()
30583 1726853694.84188: done with get_vars()
30583 1726853694.84190: filtering new block on tags
30583 1726853694.84215: done filtering new block on tags
30583 1726853694.84218: done iterating over new_blocks loaded from include file

included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node2 => (item=tasks/cleanup_profile+device.yml)

30583 1726853694.84222: extending task lists for all hosts with included blocks
30583 1726853694.86533: done extending task lists
30583 1726853694.86535: done processing included files
30583 1726853694.86535: results queue empty
30583 1726853694.86536: checking for any_errors_fatal
30583 1726853694.86539: done checking for any_errors_fatal
30583 1726853694.86540: checking for max_fail_percentage
30583 1726853694.86541: done checking for max_fail_percentage
30583 1726853694.86542: checking to see if all hosts have failed and the running result is not ok
30583 1726853694.86543: done checking to see if all hosts have failed
30583 1726853694.86543: getting the remaining hosts for this loop
30583 1726853694.86545: done getting the remaining hosts for this loop
30583 1726853694.86547: getting the next task for host managed_node2
30583 1726853694.86552: done getting next task for host managed_node2
30583 1726853694.86554: ^ task is: TASK: Cleanup profile and device
30583 1726853694.86560: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853694.86562: getting variables
30583 1726853694.86563: in VariableManager get_vars()
30583 1726853694.86579: Calling all_inventory to load vars for managed_node2
30583 1726853694.86582: Calling groups_inventory to load vars for managed_node2
30583 1726853694.86584: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853694.86590: Calling all_plugins_play to load vars for managed_node2
30583 1726853694.86593: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853694.86596: Calling groups_plugins_play to load vars for managed_node2
30583 1726853694.88259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853694.90467: done with get_vars()
30583 1726853694.90494: done getting variables
30583 1726853694.90544: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Cleanup profile and device] **********************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3
Friday 20 September 2024  13:34:54 -0400 (0:00:00.130)       0:00:30.243 ******

30583 1726853694.90581: entering _queue_task() for managed_node2/shell
30583 1726853694.91189: worker is 1 (out of 1 available)
30583 1726853694.91199: exiting _queue_task() for managed_node2/shell
30583 1726853694.91209: done queuing things up, now waiting for results queue to drain
30583 1726853694.91210: waiting for pending results...
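The records that follow execute the shell task at `cleanup_profile+device.yml:3`. The log confirms the task name, the `shell` action, and the use of the `interface` play variable, but not the command itself; a plausible shape of such a task, with the commands purely illustrative:

```yaml
# The commands below are an assumption for illustration only; the log confirms
# just the task name, the shell action, and the {{ interface }} variable.
- name: Cleanup profile and device
  shell: |
    nmcli connection delete "{{ interface }}" || true
    ip link delete "{{ interface }}" || true
  ignore_errors: true
```

Unlike the controller-side `assert` and `debug` tasks earlier, this one must reach the managed host, which is why the log switches into `_low_level_execute_command()` and raw SSH traffic below.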
30583 1726853694.91343: running TaskExecutor() for managed_node2/TASK: Cleanup profile and device
30583 1726853694.91445: in run() - task 02083763-bbaf-05ea-abc5-0000000009a0
30583 1726853694.91472: variable 'ansible_search_path' from source: unknown
30583 1726853694.91549: variable 'ansible_search_path' from source: unknown
30583 1726853694.91553: calling self._execute()
30583 1726853694.91622: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853694.91634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853694.91654: variable 'omit' from source: magic vars
30583 1726853694.92065: variable 'ansible_distribution_major_version' from source: facts
30583 1726853694.92090: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853694.92106: variable 'omit' from source: magic vars
30583 1726853694.92152: variable 'omit' from source: magic vars
30583 1726853694.92324: variable 'interface' from source: play vars
30583 1726853694.92377: variable 'omit' from source: magic vars
30583 1726853694.92395: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30583 1726853694.92439: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30583 1726853694.92460: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30583 1726853694.92675: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30583 1726853694.92679: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30583 1726853694.92681: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30583 1726853694.92683: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853694.92685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853694.92687: Set connection var ansible_module_compression to ZIP_DEFLATED
30583 1726853694.92689: Set connection var ansible_timeout to 10
30583 1726853694.92691: Set connection var ansible_connection to ssh
30583 1726853694.92693: Set connection var ansible_shell_executable to /bin/sh
30583 1726853694.92695: Set connection var ansible_shell_type to sh
30583 1726853694.92696: Set connection var ansible_pipelining to False
30583 1726853694.92702: variable 'ansible_shell_executable' from source: unknown
30583 1726853694.92704: variable 'ansible_connection' from source: unknown
30583 1726853694.92707: variable 'ansible_module_compression' from source: unknown
30583 1726853694.92711: variable 'ansible_shell_type' from source: unknown
30583 1726853694.92713: variable 'ansible_shell_executable' from source: unknown
30583 1726853694.92715: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853694.92720: variable 'ansible_pipelining' from source: unknown
30583 1726853694.92722: variable 'ansible_timeout' from source: unknown
30583 1726853694.92726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853694.92885: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30583 1726853694.92896: variable 'omit' from source: magic vars
30583 1726853694.92902: starting attempt loop
30583 1726853694.92905: running the handler
30583 1726853694.92915: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30583 1726853694.92935: _low_level_execute_command(): starting
30583 1726853694.92942: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
30583 1726853694.93732: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
30583 1726853694.93744: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
30583 1726853694.93756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
30583 1726853694.93777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30583 1726853694.93791: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<<
30583 1726853694.93798: stderr chunk (state=3): >>>debug2: match not found <<<
30583 1726853694.93809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853694.93831: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
30583 1726853694.93841: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<<
30583 1726853694.93848: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
30583 1726853694.93942: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853694.93953: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<<
30583 1726853694.93968: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30583 1726853694.93991: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30583 1726853694.94101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30583 1726853694.95844: stdout chunk (state=3): >>>/root <<<
30583 1726853694.96177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30583 1726853694.96180: stdout chunk (state=3): >>><<<
30583 1726853694.96183: stderr chunk (state=3): >>><<<
30583 1726853694.96190: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30583 1726853694.96221: _low_level_execute_command(): starting
30583 1726853694.96225: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853694.961901-31988-64422512453495 `" && echo ansible-tmp-1726853694.961901-31988-64422512453495="` echo /root/.ansible/tmp/ansible-tmp-1726853694.961901-31988-64422512453495 `" ) && sleep 0'
30583 1726853694.97355: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
30583 1726853694.97405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
30583 1726853694.97422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
30583 1726853694.97445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30583 1726853694.97559: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853694.97642: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<<
30583 1726853694.97687: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
30583 1726853694.97792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30583 1726853694.99837: stdout chunk (state=3): >>>ansible-tmp-1726853694.961901-31988-64422512453495=/root/.ansible/tmp/ansible-tmp-1726853694.961901-31988-64422512453495
<<< 30583 1726853694.99977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853695.00006: stdout chunk (state=3): >>><<< 30583 1726853695.00009: stderr chunk (state=3): >>><<< 30583 1726853695.00026: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853694.961901-31988-64422512453495=/root/.ansible/tmp/ansible-tmp-1726853694.961901-31988-64422512453495 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853695.00076: variable 'ansible_module_compression' from source: unknown 30583 1726853695.00131: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30583 1726853695.00342: variable 'ansible_facts' from source: unknown 30583 1726853695.00345: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726853694.961901-31988-64422512453495/AnsiballZ_command.py 30583 1726853695.00487: Sending initial data 30583 1726853695.00491: Sent initial data (154 bytes) 30583 1726853695.01027: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853695.01044: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853695.01059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853695.01126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853695.01183: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853695.01200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853695.01229: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853695.01496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853695.03146: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports 
extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853695.03214: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853695.03294: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpbudh1pga /root/.ansible/tmp/ansible-tmp-1726853694.961901-31988-64422512453495/AnsiballZ_command.py <<< 30583 1726853695.03304: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853694.961901-31988-64422512453495/AnsiballZ_command.py" <<< 30583 1726853695.03361: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpbudh1pga" to remote "/root/.ansible/tmp/ansible-tmp-1726853694.961901-31988-64422512453495/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853694.961901-31988-64422512453495/AnsiballZ_command.py" <<< 30583 1726853695.04755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853695.04766: stdout chunk (state=3): >>><<< 30583 1726853695.04780: stderr chunk (state=3): >>><<< 30583 1726853695.04843: done transferring module to remote 30583 1726853695.04888: _low_level_execute_command(): starting 30583 1726853695.04899: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853694.961901-31988-64422512453495/ 
/root/.ansible/tmp/ansible-tmp-1726853694.961901-31988-64422512453495/AnsiballZ_command.py && sleep 0' 30583 1726853695.06154: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853695.06375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853695.06484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853695.06587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853695.08550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853695.08561: stdout chunk (state=3): >>><<< 30583 1726853695.08575: stderr chunk (state=3): >>><<< 30583 1726853695.08867: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853695.08874: _low_level_execute_command(): starting 30583 1726853695.08877: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853694.961901-31988-64422512453495/AnsiballZ_command.py && sleep 0' 30583 1726853695.10019: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853695.10022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853695.10025: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853695.10027: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 
1726853695.10029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853695.10031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853695.10125: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853695.10316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853695.10447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853695.29645: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (082d2e42-0ca8-4d06-a689-24a49f64d485) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 13:34:55.260979", "end": "2024-09-20 13:34:55.295320", "delta": "0:00:00.034341", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30583 1726853695.31429: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853695.31462: stderr chunk (state=3): >>><<< 30583 1726853695.31479: stdout chunk (state=3): >>><<< 30583 1726853695.31713: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "Connection 'statebr' (082d2e42-0ca8-4d06-a689-24a49f64d485) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 13:34:55.260979", "end": "2024-09-20 13:34:55.295320", "delta": "0:00:00.034341", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.197 closed. 30583 1726853695.31716: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853694.961901-31988-64422512453495/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853695.31720: _low_level_execute_command(): starting 30583 1726853695.31726: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853694.961901-31988-64422512453495/ > /dev/null 2>&1 && sleep 0' 30583 1726853695.32716: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853695.32730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853695.32739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853695.32997: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853695.33050: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853695.33100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853695.35017: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853695.35114: stderr chunk (state=3): >>><<< 30583 1726853695.35117: stdout chunk (state=3): >>><<< 30583 1726853695.35120: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853695.35276: handler run complete 30583 1726853695.35279: Evaluated conditional (False): False 30583 1726853695.35281: attempt loop complete, returning result 30583 1726853695.35283: _execute() done 30583 1726853695.35285: dumping result to json 30583 1726853695.35287: done dumping result, returning 30583 1726853695.35289: done running TaskExecutor() for managed_node2/TASK: Cleanup profile and device [02083763-bbaf-05ea-abc5-0000000009a0] 30583 1726853695.35291: sending task result for task 02083763-bbaf-05ea-abc5-0000000009a0 30583 1726853695.35363: done sending task result for task 02083763-bbaf-05ea-abc5-0000000009a0 30583 1726853695.35366: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.034341", "end": "2024-09-20 13:34:55.295320", "rc": 1, "start": "2024-09-20 13:34:55.260979" } STDOUT: Connection 'statebr' (082d2e42-0ca8-4d06-a689-24a49f64d485) successfully deleted. 
STDERR: Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' Cannot find device "statebr" MSG: non-zero return code ...ignoring 30583 1726853695.35440: no more pending results, returning what we have 30583 1726853695.35445: results queue empty 30583 1726853695.35446: checking for any_errors_fatal 30583 1726853695.35448: done checking for any_errors_fatal 30583 1726853695.35448: checking for max_fail_percentage 30583 1726853695.35450: done checking for max_fail_percentage 30583 1726853695.35451: checking to see if all hosts have failed and the running result is not ok 30583 1726853695.35452: done checking to see if all hosts have failed 30583 1726853695.35453: getting the remaining hosts for this loop 30583 1726853695.35455: done getting the remaining hosts for this loop 30583 1726853695.35459: getting the next task for host managed_node2 30583 1726853695.35475: done getting next task for host managed_node2 30583 1726853695.35479: ^ task is: TASK: Include the task 'run_test.yml' 30583 1726853695.35481: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853695.35486: getting variables 30583 1726853695.35488: in VariableManager get_vars() 30583 1726853695.35524: Calling all_inventory to load vars for managed_node2 30583 1726853695.35527: Calling groups_inventory to load vars for managed_node2 30583 1726853695.35531: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853695.35543: Calling all_plugins_play to load vars for managed_node2 30583 1726853695.35547: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853695.35551: Calling groups_plugins_play to load vars for managed_node2 30583 1726853695.38650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853695.42051: done with get_vars() 30583 1726853695.42080: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:65 Friday 20 September 2024 13:34:55 -0400 (0:00:00.517) 0:00:30.760 ****** 30583 1726853695.42290: entering _queue_task() for managed_node2/include_tasks 30583 1726853695.42988: worker is 1 (out of 1 available) 30583 1726853695.43001: exiting _queue_task() for managed_node2/include_tasks 30583 1726853695.43013: done queuing things up, now waiting for results queue to drain 30583 1726853695.43015: waiting for pending results... 
30583 1726853695.43794: running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' 30583 1726853695.43802: in run() - task 02083763-bbaf-05ea-abc5-000000000011 30583 1726853695.43805: variable 'ansible_search_path' from source: unknown 30583 1726853695.43988: calling self._execute() 30583 1726853695.44081: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853695.44087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853695.44099: variable 'omit' from source: magic vars 30583 1726853695.44779: variable 'ansible_distribution_major_version' from source: facts 30583 1726853695.44785: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853695.44791: _execute() done 30583 1726853695.44795: dumping result to json 30583 1726853695.44797: done dumping result, returning 30583 1726853695.44804: done running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' [02083763-bbaf-05ea-abc5-000000000011] 30583 1726853695.44808: sending task result for task 02083763-bbaf-05ea-abc5-000000000011 30583 1726853695.45019: no more pending results, returning what we have 30583 1726853695.45025: in VariableManager get_vars() 30583 1726853695.45066: Calling all_inventory to load vars for managed_node2 30583 1726853695.45069: Calling groups_inventory to load vars for managed_node2 30583 1726853695.45076: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853695.45089: Calling all_plugins_play to load vars for managed_node2 30583 1726853695.45091: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853695.45094: Calling groups_plugins_play to load vars for managed_node2 30583 1726853695.45684: done sending task result for task 02083763-bbaf-05ea-abc5-000000000011 30583 1726853695.45687: WORKER PROCESS EXITING 30583 1726853695.48090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 30583 1726853695.51425: done with get_vars() 30583 1726853695.51451: variable 'ansible_search_path' from source: unknown 30583 1726853695.51577: we have included files to process 30583 1726853695.51582: generating all_blocks data 30583 1726853695.51585: done generating all_blocks data 30583 1726853695.51590: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30583 1726853695.51591: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30583 1726853695.51595: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30583 1726853695.52720: in VariableManager get_vars() 30583 1726853695.52738: done with get_vars() 30583 1726853695.52918: in VariableManager get_vars() 30583 1726853695.52936: done with get_vars() 30583 1726853695.52977: in VariableManager get_vars() 30583 1726853695.53108: done with get_vars() 30583 1726853695.53153: in VariableManager get_vars() 30583 1726853695.53170: done with get_vars() 30583 1726853695.53275: in VariableManager get_vars() 30583 1726853695.53291: done with get_vars() 30583 1726853695.54253: in VariableManager get_vars() 30583 1726853695.54269: done with get_vars() 30583 1726853695.54284: done processing included file 30583 1726853695.54286: iterating over new_blocks loaded from include file 30583 1726853695.54287: in VariableManager get_vars() 30583 1726853695.54299: done with get_vars() 30583 1726853695.54301: filtering new block on tags 30583 1726853695.54589: done filtering new block on tags 30583 1726853695.54592: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node2 30583 1726853695.54598: extending task lists for all hosts with included 
blocks 30583 1726853695.54634: done extending task lists 30583 1726853695.54635: done processing included files 30583 1726853695.54636: results queue empty 30583 1726853695.54637: checking for any_errors_fatal 30583 1726853695.54758: done checking for any_errors_fatal 30583 1726853695.54760: checking for max_fail_percentage 30583 1726853695.54762: done checking for max_fail_percentage 30583 1726853695.54762: checking to see if all hosts have failed and the running result is not ok 30583 1726853695.54763: done checking to see if all hosts have failed 30583 1726853695.54764: getting the remaining hosts for this loop 30583 1726853695.54765: done getting the remaining hosts for this loop 30583 1726853695.54768: getting the next task for host managed_node2 30583 1726853695.54774: done getting next task for host managed_node2 30583 1726853695.54776: ^ task is: TASK: TEST: {{ lsr_description }} 30583 1726853695.54778: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853695.54781: getting variables 30583 1726853695.54782: in VariableManager get_vars() 30583 1726853695.54790: Calling all_inventory to load vars for managed_node2 30583 1726853695.54792: Calling groups_inventory to load vars for managed_node2 30583 1726853695.54794: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853695.54800: Calling all_plugins_play to load vars for managed_node2 30583 1726853695.54802: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853695.54805: Calling groups_plugins_play to load vars for managed_node2 30583 1726853695.56611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853695.61629: done with get_vars() 30583 1726853695.61652: done getting variables 30583 1726853695.61699: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853695.62229: variable 'lsr_description' from source: include params TASK [TEST: I can activate an existing profile] ******************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 13:34:55 -0400 (0:00:00.199) 0:00:30.960 ****** 30583 1726853695.62289: entering _queue_task() for managed_node2/debug 30583 1726853695.62994: worker is 1 (out of 1 available) 30583 1726853695.63121: exiting _queue_task() for managed_node2/debug 30583 1726853695.63134: done queuing things up, now waiting for results queue to drain 30583 1726853695.63135: waiting for pending results... 
30583 1726853695.63615: running TaskExecutor() for managed_node2/TASK: TEST: I can activate an existing profile 30583 1726853695.63987: in run() - task 02083763-bbaf-05ea-abc5-000000000a49 30583 1726853695.63991: variable 'ansible_search_path' from source: unknown 30583 1726853695.63993: variable 'ansible_search_path' from source: unknown 30583 1726853695.63996: calling self._execute() 30583 1726853695.64279: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853695.64282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853695.64284: variable 'omit' from source: magic vars 30583 1726853695.65040: variable 'ansible_distribution_major_version' from source: facts 30583 1726853695.65060: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853695.65079: variable 'omit' from source: magic vars 30583 1726853695.65122: variable 'omit' from source: magic vars 30583 1726853695.65338: variable 'lsr_description' from source: include params 30583 1726853695.65414: variable 'omit' from source: magic vars 30583 1726853695.65468: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853695.65747: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853695.65751: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853695.65753: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853695.65755: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853695.65838: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853695.65964: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853695.65967: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853695.66143: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853695.66155: Set connection var ansible_timeout to 10 30583 1726853695.66161: Set connection var ansible_connection to ssh 30583 1726853695.66170: Set connection var ansible_shell_executable to /bin/sh 30583 1726853695.66185: Set connection var ansible_shell_type to sh 30583 1726853695.66200: Set connection var ansible_pipelining to False 30583 1726853695.66263: variable 'ansible_shell_executable' from source: unknown 30583 1726853695.66292: variable 'ansible_connection' from source: unknown 30583 1726853695.66300: variable 'ansible_module_compression' from source: unknown 30583 1726853695.66395: variable 'ansible_shell_type' from source: unknown 30583 1726853695.66398: variable 'ansible_shell_executable' from source: unknown 30583 1726853695.66401: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853695.66403: variable 'ansible_pipelining' from source: unknown 30583 1726853695.66405: variable 'ansible_timeout' from source: unknown 30583 1726853695.66407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853695.66641: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853695.66777: variable 'omit' from source: magic vars 30583 1726853695.66781: starting attempt loop 30583 1726853695.66783: running the handler 30583 1726853695.66847: handler run complete 30583 1726853695.66867: attempt loop complete, returning result 30583 1726853695.66890: _execute() done 30583 1726853695.66995: dumping result to json 30583 1726853695.66998: done dumping result, returning 30583 
1726853695.67001: done running TaskExecutor() for managed_node2/TASK: TEST: I can activate an existing profile [02083763-bbaf-05ea-abc5-000000000a49] 30583 1726853695.67003: sending task result for task 02083763-bbaf-05ea-abc5-000000000a49 30583 1726853695.67377: done sending task result for task 02083763-bbaf-05ea-abc5-000000000a49 30583 1726853695.67384: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: ########## I can activate an existing profile ########## 30583 1726853695.67436: no more pending results, returning what we have 30583 1726853695.67440: results queue empty 30583 1726853695.67442: checking for any_errors_fatal 30583 1726853695.67443: done checking for any_errors_fatal 30583 1726853695.67444: checking for max_fail_percentage 30583 1726853695.67446: done checking for max_fail_percentage 30583 1726853695.67446: checking to see if all hosts have failed and the running result is not ok 30583 1726853695.67447: done checking to see if all hosts have failed 30583 1726853695.67448: getting the remaining hosts for this loop 30583 1726853695.67450: done getting the remaining hosts for this loop 30583 1726853695.67454: getting the next task for host managed_node2 30583 1726853695.67463: done getting next task for host managed_node2 30583 1726853695.67466: ^ task is: TASK: Show item 30583 1726853695.67469: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853695.67475: getting variables 30583 1726853695.67477: in VariableManager get_vars() 30583 1726853695.67602: Calling all_inventory to load vars for managed_node2 30583 1726853695.67606: Calling groups_inventory to load vars for managed_node2 30583 1726853695.67611: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853695.67622: Calling all_plugins_play to load vars for managed_node2 30583 1726853695.67625: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853695.67628: Calling groups_plugins_play to load vars for managed_node2 30583 1726853695.70486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853695.73968: done with get_vars() 30583 1726853695.74002: done getting variables 30583 1726853695.74130: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 13:34:55 -0400 (0:00:00.118) 0:00:31.078 ****** 30583 1726853695.74161: entering _queue_task() for managed_node2/debug 30583 1726853695.75109: worker is 1 (out of 1 available) 30583 1726853695.75119: exiting _queue_task() for managed_node2/debug 30583 1726853695.75129: done queuing things up, now waiting for results queue to drain 30583 1726853695.75131: waiting for pending results... 
30583 1726853695.75633: running TaskExecutor() for managed_node2/TASK: Show item 30583 1726853695.75750: in run() - task 02083763-bbaf-05ea-abc5-000000000a4a 30583 1726853695.75759: variable 'ansible_search_path' from source: unknown 30583 1726853695.75767: variable 'ansible_search_path' from source: unknown 30583 1726853695.75830: variable 'omit' from source: magic vars 30583 1726853695.76278: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853695.76281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853695.76284: variable 'omit' from source: magic vars 30583 1726853695.77001: variable 'ansible_distribution_major_version' from source: facts 30583 1726853695.77017: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853695.77138: variable 'omit' from source: magic vars 30583 1726853695.77141: variable 'omit' from source: magic vars 30583 1726853695.77187: variable 'item' from source: unknown 30583 1726853695.77376: variable 'item' from source: unknown 30583 1726853695.77427: variable 'omit' from source: magic vars 30583 1726853695.77575: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853695.77613: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853695.77699: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853695.77725: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853695.77755: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853695.77895: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853695.77899: variable 'ansible_host' from source: host vars for 'managed_node2' 
30583 1726853695.77901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853695.78122: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853695.78138: Set connection var ansible_timeout to 10 30583 1726853695.78221: Set connection var ansible_connection to ssh 30583 1726853695.78224: Set connection var ansible_shell_executable to /bin/sh 30583 1726853695.78226: Set connection var ansible_shell_type to sh 30583 1726853695.78228: Set connection var ansible_pipelining to False 30583 1726853695.78329: variable 'ansible_shell_executable' from source: unknown 30583 1726853695.78332: variable 'ansible_connection' from source: unknown 30583 1726853695.78334: variable 'ansible_module_compression' from source: unknown 30583 1726853695.78336: variable 'ansible_shell_type' from source: unknown 30583 1726853695.78337: variable 'ansible_shell_executable' from source: unknown 30583 1726853695.78339: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853695.78341: variable 'ansible_pipelining' from source: unknown 30583 1726853695.78342: variable 'ansible_timeout' from source: unknown 30583 1726853695.78344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853695.78662: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853695.78684: variable 'omit' from source: magic vars 30583 1726853695.78694: starting attempt loop 30583 1726853695.78724: running the handler 30583 1726853695.78875: variable 'lsr_description' from source: include params 30583 1726853695.78985: variable 'lsr_description' from source: include params 30583 1726853695.79003: handler run complete 30583 1726853695.79067: attempt loop 
complete, returning result 30583 1726853695.79201: variable 'item' from source: unknown 30583 1726853695.79263: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can activate an existing profile" } 30583 1726853695.79779: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853695.79783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853695.79785: variable 'omit' from source: magic vars 30583 1726853695.80097: variable 'ansible_distribution_major_version' from source: facts 30583 1726853695.80101: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853695.80104: variable 'omit' from source: magic vars 30583 1726853695.80106: variable 'omit' from source: magic vars 30583 1726853695.80108: variable 'item' from source: unknown 30583 1726853695.80203: variable 'item' from source: unknown 30583 1726853695.80239: variable 'omit' from source: magic vars 30583 1726853695.80295: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853695.80319: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853695.80338: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853695.80549: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853695.80552: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853695.80555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853695.80557: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853695.80559: Set connection var ansible_timeout to 10 
30583 1726853695.80561: Set connection var ansible_connection to ssh 30583 1726853695.80563: Set connection var ansible_shell_executable to /bin/sh 30583 1726853695.80656: Set connection var ansible_shell_type to sh 30583 1726853695.80659: Set connection var ansible_pipelining to False 30583 1726853695.80683: variable 'ansible_shell_executable' from source: unknown 30583 1726853695.80690: variable 'ansible_connection' from source: unknown 30583 1726853695.80697: variable 'ansible_module_compression' from source: unknown 30583 1726853695.80703: variable 'ansible_shell_type' from source: unknown 30583 1726853695.80709: variable 'ansible_shell_executable' from source: unknown 30583 1726853695.80764: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853695.80767: variable 'ansible_pipelining' from source: unknown 30583 1726853695.80769: variable 'ansible_timeout' from source: unknown 30583 1726853695.80773: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853695.81376: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853695.81379: variable 'omit' from source: magic vars 30583 1726853695.81383: starting attempt loop 30583 1726853695.81385: running the handler 30583 1726853695.81387: variable 'lsr_setup' from source: include params 30583 1726853695.81391: variable 'lsr_setup' from source: include params 30583 1726853695.81394: handler run complete 30583 1726853695.81397: attempt loop complete, returning result 30583 1726853695.81400: variable 'item' from source: unknown 30583 1726853695.81434: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ 
"tasks/create_bridge_profile.yml" ] } 30583 1726853695.81607: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853695.81614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853695.81639: variable 'omit' from source: magic vars 30583 1726853695.82175: variable 'ansible_distribution_major_version' from source: facts 30583 1726853695.82178: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853695.82181: variable 'omit' from source: magic vars 30583 1726853695.82183: variable 'omit' from source: magic vars 30583 1726853695.82185: variable 'item' from source: unknown 30583 1726853695.82376: variable 'item' from source: unknown 30583 1726853695.82379: variable 'omit' from source: magic vars 30583 1726853695.82382: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853695.82384: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853695.82387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853695.82390: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853695.82393: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853695.82396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853695.82452: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853695.82460: Set connection var ansible_timeout to 10 30583 1726853695.82462: Set connection var ansible_connection to ssh 30583 1726853695.82465: Set connection var ansible_shell_executable to /bin/sh 30583 1726853695.82467: Set connection var ansible_shell_type to sh 30583 1726853695.82477: Set connection var 
ansible_pipelining to False 30583 1726853695.82584: variable 'ansible_shell_executable' from source: unknown 30583 1726853695.82587: variable 'ansible_connection' from source: unknown 30583 1726853695.82590: variable 'ansible_module_compression' from source: unknown 30583 1726853695.82592: variable 'ansible_shell_type' from source: unknown 30583 1726853695.82595: variable 'ansible_shell_executable' from source: unknown 30583 1726853695.82597: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853695.82624: variable 'ansible_pipelining' from source: unknown 30583 1726853695.82627: variable 'ansible_timeout' from source: unknown 30583 1726853695.82631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853695.82772: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853695.82780: variable 'omit' from source: magic vars 30583 1726853695.82782: starting attempt loop 30583 1726853695.82785: running the handler 30583 1726853695.82804: variable 'lsr_test' from source: include params 30583 1726853695.83175: variable 'lsr_test' from source: include params 30583 1726853695.83179: handler run complete 30583 1726853695.83181: attempt loop complete, returning result 30583 1726853695.83183: variable 'item' from source: unknown 30583 1726853695.83277: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/activate_profile.yml" ] } 30583 1726853695.83358: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853695.83362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853695.83364: variable 'omit' from 
source: magic vars 30583 1726853695.83876: variable 'ansible_distribution_major_version' from source: facts 30583 1726853695.83879: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853695.83882: variable 'omit' from source: magic vars 30583 1726853695.83884: variable 'omit' from source: magic vars 30583 1726853695.83887: variable 'item' from source: unknown 30583 1726853695.84042: variable 'item' from source: unknown 30583 1726853695.84058: variable 'omit' from source: magic vars 30583 1726853695.84079: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853695.84086: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853695.84092: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853695.84103: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853695.84106: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853695.84108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853695.84298: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853695.84475: Set connection var ansible_timeout to 10 30583 1726853695.84478: Set connection var ansible_connection to ssh 30583 1726853695.84480: Set connection var ansible_shell_executable to /bin/sh 30583 1726853695.84483: Set connection var ansible_shell_type to sh 30583 1726853695.84485: Set connection var ansible_pipelining to False 30583 1726853695.84486: variable 'ansible_shell_executable' from source: unknown 30583 1726853695.84488: variable 'ansible_connection' from source: unknown 30583 1726853695.84490: variable 'ansible_module_compression' from source: unknown 30583 1726853695.84492: 
variable 'ansible_shell_type' from source: unknown 30583 1726853695.84494: variable 'ansible_shell_executable' from source: unknown 30583 1726853695.84496: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853695.84498: variable 'ansible_pipelining' from source: unknown 30583 1726853695.84499: variable 'ansible_timeout' from source: unknown 30583 1726853695.84501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853695.84561: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853695.84681: variable 'omit' from source: magic vars 30583 1726853695.84684: starting attempt loop 30583 1726853695.84687: running the handler 30583 1726853695.84708: variable 'lsr_assert' from source: include params 30583 1726853695.84773: variable 'lsr_assert' from source: include params 30583 1726853695.84902: handler run complete 30583 1726853695.84915: attempt loop complete, returning result 30583 1726853695.84928: variable 'item' from source: unknown 30583 1726853695.85061: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_device_present.yml", "tasks/assert_profile_present.yml" ] } 30583 1726853695.85266: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853695.85281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853695.85462: variable 'omit' from source: magic vars 30583 1726853695.85677: variable 'ansible_distribution_major_version' from source: facts 30583 1726853695.85794: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853695.85813: variable 'omit' 
from source: magic vars 30583 1726853695.85832: variable 'omit' from source: magic vars 30583 1726853695.85875: variable 'item' from source: unknown 30583 1726853695.86025: variable 'item' from source: unknown 30583 1726853695.86028: variable 'omit' from source: magic vars 30583 1726853695.86117: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853695.86135: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853695.86146: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853695.86162: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853695.86242: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853695.86245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853695.86366: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853695.86378: Set connection var ansible_timeout to 10 30583 1726853695.86385: Set connection var ansible_connection to ssh 30583 1726853695.86394: Set connection var ansible_shell_executable to /bin/sh 30583 1726853695.86458: Set connection var ansible_shell_type to sh 30583 1726853695.86461: Set connection var ansible_pipelining to False 30583 1726853695.86483: variable 'ansible_shell_executable' from source: unknown 30583 1726853695.86654: variable 'ansible_connection' from source: unknown 30583 1726853695.86657: variable 'ansible_module_compression' from source: unknown 30583 1726853695.86659: variable 'ansible_shell_type' from source: unknown 30583 1726853695.86661: variable 'ansible_shell_executable' from source: unknown 30583 1726853695.86663: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 
1726853695.86665: variable 'ansible_pipelining' from source: unknown 30583 1726853695.86667: variable 'ansible_timeout' from source: unknown 30583 1726853695.86672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853695.86786: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853695.86789: variable 'omit' from source: magic vars 30583 1726853695.86791: starting attempt loop 30583 1726853695.86793: running the handler 30583 1726853695.87086: handler run complete 30583 1726853695.87089: attempt loop complete, returning result 30583 1726853695.87091: variable 'item' from source: unknown 30583 1726853695.87186: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 30583 1726853695.87576: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853695.87579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853695.87582: variable 'omit' from source: magic vars 30583 1726853695.87759: variable 'ansible_distribution_major_version' from source: facts 30583 1726853695.87769: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853695.87778: variable 'omit' from source: magic vars 30583 1726853695.87957: variable 'omit' from source: magic vars 30583 1726853695.87960: variable 'item' from source: unknown 30583 1726853695.87962: variable 'item' from source: unknown 30583 1726853695.88082: variable 'omit' from source: magic vars 30583 1726853695.88103: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853695.88114: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853695.88123: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853695.88137: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853695.88182: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853695.88189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853695.88259: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853695.88478: Set connection var ansible_timeout to 10 30583 1726853695.88481: Set connection var ansible_connection to ssh 30583 1726853695.88483: Set connection var ansible_shell_executable to /bin/sh 30583 1726853695.88485: Set connection var ansible_shell_type to sh 30583 1726853695.88487: Set connection var ansible_pipelining to False 30583 1726853695.88488: variable 'ansible_shell_executable' from source: unknown 30583 1726853695.88490: variable 'ansible_connection' from source: unknown 30583 1726853695.88492: variable 'ansible_module_compression' from source: unknown 30583 1726853695.88495: variable 'ansible_shell_type' from source: unknown 30583 1726853695.88497: variable 'ansible_shell_executable' from source: unknown 30583 1726853695.88499: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853695.88501: variable 'ansible_pipelining' from source: unknown 30583 1726853695.88503: variable 'ansible_timeout' from source: unknown 30583 1726853695.88504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853695.88647: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853695.88704: variable 'omit' from source: magic vars 30583 1726853695.88712: starting attempt loop 30583 1726853695.88717: running the handler 30583 1726853695.88946: variable 'lsr_fail_debug' from source: play vars 30583 1726853695.88949: variable 'lsr_fail_debug' from source: play vars 30583 1726853695.88951: handler run complete 30583 1726853695.88982: attempt loop complete, returning result 30583 1726853695.89000: variable 'item' from source: unknown 30583 1726853695.89113: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 30583 1726853695.89421: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853695.89435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853695.89447: variable 'omit' from source: magic vars 30583 1726853695.89737: variable 'ansible_distribution_major_version' from source: facts 30583 1726853695.89752: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853695.89760: variable 'omit' from source: magic vars 30583 1726853695.89827: variable 'omit' from source: magic vars 30583 1726853695.89874: variable 'item' from source: unknown 30583 1726853695.90074: variable 'item' from source: unknown 30583 1726853695.90077: variable 'omit' from source: magic vars 30583 1726853695.90086: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853695.90177: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853695.90180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853695.90187: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853695.90189: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853695.90191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853695.90475: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853695.90478: Set connection var ansible_timeout to 10 30583 1726853695.90480: Set connection var ansible_connection to ssh 30583 1726853695.90482: Set connection var ansible_shell_executable to /bin/sh 30583 1726853695.90484: Set connection var ansible_shell_type to sh 30583 1726853695.90486: Set connection var ansible_pipelining to False 30583 1726853695.90487: variable 'ansible_shell_executable' from source: unknown 30583 1726853695.90489: variable 'ansible_connection' from source: unknown 30583 1726853695.90491: variable 'ansible_module_compression' from source: unknown 30583 1726853695.90492: variable 'ansible_shell_type' from source: unknown 30583 1726853695.90494: variable 'ansible_shell_executable' from source: unknown 30583 1726853695.90500: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853695.90503: variable 'ansible_pipelining' from source: unknown 30583 1726853695.90504: variable 'ansible_timeout' from source: unknown 30583 1726853695.90506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853695.90648: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853695.90940: variable 'omit' from source: magic vars 30583 1726853695.90943: starting attempt loop 30583 1726853695.90946: running the handler 30583 1726853695.90948: variable 'lsr_cleanup' from source: include params 30583 1726853695.90950: variable 'lsr_cleanup' from source: include params 30583 1726853695.91047: handler run complete 30583 1726853695.91050: attempt loop complete, returning result 30583 1726853695.91053: variable 'item' from source: unknown 30583 1726853695.91157: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 30583 1726853695.91277: dumping result to json 30583 1726853695.91280: done dumping result, returning 30583 1726853695.91292: done running TaskExecutor() for managed_node2/TASK: Show item [02083763-bbaf-05ea-abc5-000000000a4a] 30583 1726853695.91476: sending task result for task 02083763-bbaf-05ea-abc5-000000000a4a 30583 1726853695.91527: done sending task result for task 02083763-bbaf-05ea-abc5-000000000a4a 30583 1726853695.91530: WORKER PROCESS EXITING 30583 1726853695.91584: no more pending results, returning what we have 30583 1726853695.91588: results queue empty 30583 1726853695.91589: checking for any_errors_fatal 30583 1726853695.91598: done checking for any_errors_fatal 30583 1726853695.91599: checking for max_fail_percentage 30583 1726853695.91601: done checking for max_fail_percentage 30583 1726853695.91602: checking to see if all hosts have failed and the running result is not ok 30583 1726853695.91603: done checking to see if all hosts have failed 30583 1726853695.91603: getting the remaining hosts for this loop 30583 1726853695.91605: done getting the remaining hosts for this loop 30583 
1726853695.91609: getting the next task for host managed_node2 30583 1726853695.91616: done getting next task for host managed_node2 30583 1726853695.91620: ^ task is: TASK: Include the task 'show_interfaces.yml' 30583 1726853695.91624: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853695.91628: getting variables 30583 1726853695.91630: in VariableManager get_vars() 30583 1726853695.91668: Calling all_inventory to load vars for managed_node2 30583 1726853695.91672: Calling groups_inventory to load vars for managed_node2 30583 1726853695.91677: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853695.91688: Calling all_plugins_play to load vars for managed_node2 30583 1726853695.91692: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853695.91695: Calling groups_plugins_play to load vars for managed_node2 30583 1726853695.95036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853695.98213: done with get_vars() 30583 1726853695.98248: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 13:34:55 -0400 (0:00:00.242) 0:00:31.321 ****** 30583 1726853695.98447: entering _queue_task() for managed_node2/include_tasks 30583 
1726853695.99239: worker is 1 (out of 1 available) 30583 1726853695.99251: exiting _queue_task() for managed_node2/include_tasks 30583 1726853695.99488: done queuing things up, now waiting for results queue to drain 30583 1726853695.99490: waiting for pending results... 30583 1726853695.99936: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 30583 1726853696.00081: in run() - task 02083763-bbaf-05ea-abc5-000000000a4b 30583 1726853696.00358: variable 'ansible_search_path' from source: unknown 30583 1726853696.00362: variable 'ansible_search_path' from source: unknown 30583 1726853696.00365: calling self._execute() 30583 1726853696.00418: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853696.00480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853696.00496: variable 'omit' from source: magic vars 30583 1726853696.01276: variable 'ansible_distribution_major_version' from source: facts 30583 1726853696.01280: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853696.01282: _execute() done 30583 1726853696.01284: dumping result to json 30583 1726853696.01286: done dumping result, returning 30583 1726853696.01339: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [02083763-bbaf-05ea-abc5-000000000a4b] 30583 1726853696.01446: sending task result for task 02083763-bbaf-05ea-abc5-000000000a4b 30583 1726853696.01576: done sending task result for task 02083763-bbaf-05ea-abc5-000000000a4b 30583 1726853696.01579: WORKER PROCESS EXITING 30583 1726853696.01612: no more pending results, returning what we have 30583 1726853696.01618: in VariableManager get_vars() 30583 1726853696.01659: Calling all_inventory to load vars for managed_node2 30583 1726853696.01662: Calling groups_inventory to load vars for managed_node2 30583 1726853696.01672: Calling all_plugins_inventory to load vars for managed_node2 
30583 1726853696.01687: Calling all_plugins_play to load vars for managed_node2 30583 1726853696.01691: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853696.01694: Calling groups_plugins_play to load vars for managed_node2 30583 1726853696.04693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853696.08019: done with get_vars() 30583 1726853696.08046: variable 'ansible_search_path' from source: unknown 30583 1726853696.08048: variable 'ansible_search_path' from source: unknown 30583 1726853696.08092: we have included files to process 30583 1726853696.08094: generating all_blocks data 30583 1726853696.08096: done generating all_blocks data 30583 1726853696.08101: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30583 1726853696.08103: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30583 1726853696.08105: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30583 1726853696.08381: in VariableManager get_vars() 30583 1726853696.08401: done with get_vars() 30583 1726853696.08626: done processing included file 30583 1726853696.08628: iterating over new_blocks loaded from include file 30583 1726853696.08629: in VariableManager get_vars() 30583 1726853696.08643: done with get_vars() 30583 1726853696.08645: filtering new block on tags 30583 1726853696.08791: done filtering new block on tags 30583 1726853696.08794: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 30583 1726853696.08800: extending task lists for all hosts with included blocks 30583 1726853696.09741: 
done extending task lists 30583 1726853696.09743: done processing included files 30583 1726853696.09743: results queue empty 30583 1726853696.09744: checking for any_errors_fatal 30583 1726853696.09751: done checking for any_errors_fatal 30583 1726853696.09752: checking for max_fail_percentage 30583 1726853696.09753: done checking for max_fail_percentage 30583 1726853696.09753: checking to see if all hosts have failed and the running result is not ok 30583 1726853696.09754: done checking to see if all hosts have failed 30583 1726853696.09755: getting the remaining hosts for this loop 30583 1726853696.09756: done getting the remaining hosts for this loop 30583 1726853696.09759: getting the next task for host managed_node2 30583 1726853696.09763: done getting next task for host managed_node2 30583 1726853696.09765: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30583 1726853696.09768: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853696.09773: getting variables 30583 1726853696.09774: in VariableManager get_vars() 30583 1726853696.09784: Calling all_inventory to load vars for managed_node2 30583 1726853696.09786: Calling groups_inventory to load vars for managed_node2 30583 1726853696.09874: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853696.09882: Calling all_plugins_play to load vars for managed_node2 30583 1726853696.09884: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853696.09886: Calling groups_plugins_play to load vars for managed_node2 30583 1726853696.12508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853696.15634: done with get_vars() 30583 1726853696.15664: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 13:34:56 -0400 (0:00:00.174) 0:00:31.495 ****** 30583 1726853696.15859: entering _queue_task() for managed_node2/include_tasks 30583 1726853696.16698: worker is 1 (out of 1 available) 30583 1726853696.16711: exiting _queue_task() for managed_node2/include_tasks 30583 1726853696.16723: done queuing things up, now waiting for results queue to drain 30583 1726853696.16725: waiting for pending results... 
30583 1726853696.17439: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 30583 1726853696.17444: in run() - task 02083763-bbaf-05ea-abc5-000000000a72 30583 1726853696.17447: variable 'ansible_search_path' from source: unknown 30583 1726853696.17450: variable 'ansible_search_path' from source: unknown 30583 1726853696.17530: calling self._execute() 30583 1726853696.17604: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853696.17608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853696.17620: variable 'omit' from source: magic vars 30583 1726853696.18968: variable 'ansible_distribution_major_version' from source: facts 30583 1726853696.18975: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853696.18979: _execute() done 30583 1726853696.18982: dumping result to json 30583 1726853696.18984: done dumping result, returning 30583 1726853696.18986: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [02083763-bbaf-05ea-abc5-000000000a72] 30583 1726853696.18988: sending task result for task 02083763-bbaf-05ea-abc5-000000000a72 30583 1726853696.19140: done sending task result for task 02083763-bbaf-05ea-abc5-000000000a72 30583 1726853696.19142: WORKER PROCESS EXITING 30583 1726853696.19170: no more pending results, returning what we have 30583 1726853696.19181: in VariableManager get_vars() 30583 1726853696.19217: Calling all_inventory to load vars for managed_node2 30583 1726853696.19220: Calling groups_inventory to load vars for managed_node2 30583 1726853696.19223: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853696.19236: Calling all_plugins_play to load vars for managed_node2 30583 1726853696.19239: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853696.19241: Calling groups_plugins_play to load vars for managed_node2 30583 
1726853696.22609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853696.26251: done with get_vars() 30583 1726853696.26273: variable 'ansible_search_path' from source: unknown 30583 1726853696.26275: variable 'ansible_search_path' from source: unknown 30583 1726853696.26386: we have included files to process 30583 1726853696.26388: generating all_blocks data 30583 1726853696.26389: done generating all_blocks data 30583 1726853696.26391: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30583 1726853696.26392: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30583 1726853696.26394: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30583 1726853696.26935: done processing included file 30583 1726853696.26937: iterating over new_blocks loaded from include file 30583 1726853696.26939: in VariableManager get_vars() 30583 1726853696.26953: done with get_vars() 30583 1726853696.26955: filtering new block on tags 30583 1726853696.27094: done filtering new block on tags 30583 1726853696.27097: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 30583 1726853696.27102: extending task lists for all hosts with included blocks 30583 1726853696.27438: done extending task lists 30583 1726853696.27439: done processing included files 30583 1726853696.27440: results queue empty 30583 1726853696.27441: checking for any_errors_fatal 30583 1726853696.27444: done checking for any_errors_fatal 30583 1726853696.27445: checking for max_fail_percentage 30583 1726853696.27446: done 
checking for max_fail_percentage 30583 1726853696.27447: checking to see if all hosts have failed and the running result is not ok 30583 1726853696.27448: done checking to see if all hosts have failed 30583 1726853696.27449: getting the remaining hosts for this loop 30583 1726853696.27450: done getting the remaining hosts for this loop 30583 1726853696.27453: getting the next task for host managed_node2 30583 1726853696.27457: done getting next task for host managed_node2 30583 1726853696.27459: ^ task is: TASK: Gather current interface info 30583 1726853696.27463: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853696.27465: getting variables 30583 1726853696.27466: in VariableManager get_vars() 30583 1726853696.27478: Calling all_inventory to load vars for managed_node2 30583 1726853696.27480: Calling groups_inventory to load vars for managed_node2 30583 1726853696.27482: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853696.27487: Calling all_plugins_play to load vars for managed_node2 30583 1726853696.27489: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853696.27491: Calling groups_plugins_play to load vars for managed_node2 30583 1726853696.29882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853696.33129: done with get_vars() 30583 1726853696.33187: done getting variables 30583 1726853696.33234: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 13:34:56 -0400 (0:00:00.174) 0:00:31.669 ****** 30583 1726853696.33269: entering _queue_task() for managed_node2/command 30583 1726853696.33709: worker is 1 (out of 1 available) 30583 1726853696.33721: exiting _queue_task() for managed_node2/command 30583 1726853696.33731: done queuing things up, now waiting for results queue to drain 30583 1726853696.33732: waiting for pending results... 
30583 1726853696.34050: running TaskExecutor() for managed_node2/TASK: Gather current interface info 30583 1726853696.34085: in run() - task 02083763-bbaf-05ea-abc5-000000000aad 30583 1726853696.34107: variable 'ansible_search_path' from source: unknown 30583 1726853696.34116: variable 'ansible_search_path' from source: unknown 30583 1726853696.34163: calling self._execute() 30583 1726853696.34270: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853696.34288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853696.34364: variable 'omit' from source: magic vars 30583 1726853696.34699: variable 'ansible_distribution_major_version' from source: facts 30583 1726853696.34720: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853696.34731: variable 'omit' from source: magic vars 30583 1726853696.34786: variable 'omit' from source: magic vars 30583 1726853696.34835: variable 'omit' from source: magic vars 30583 1726853696.34912: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853696.35028: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853696.35034: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853696.35054: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853696.35073: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853696.35448: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853696.35452: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853696.35454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 
1726853696.35507: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853696.35521: Set connection var ansible_timeout to 10 30583 1726853696.35529: Set connection var ansible_connection to ssh 30583 1726853696.35539: Set connection var ansible_shell_executable to /bin/sh 30583 1726853696.35546: Set connection var ansible_shell_type to sh 30583 1726853696.35567: Set connection var ansible_pipelining to False 30583 1726853696.35621: variable 'ansible_shell_executable' from source: unknown 30583 1726853696.35648: variable 'ansible_connection' from source: unknown 30583 1726853696.35656: variable 'ansible_module_compression' from source: unknown 30583 1726853696.35670: variable 'ansible_shell_type' from source: unknown 30583 1726853696.35682: variable 'ansible_shell_executable' from source: unknown 30583 1726853696.35694: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853696.35703: variable 'ansible_pipelining' from source: unknown 30583 1726853696.35709: variable 'ansible_timeout' from source: unknown 30583 1726853696.35717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853696.35869: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853696.35907: variable 'omit' from source: magic vars 30583 1726853696.35910: starting attempt loop 30583 1726853696.35913: running the handler 30583 1726853696.35993: _low_level_execute_command(): starting 30583 1726853696.35997: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853696.36780: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853696.36800: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853696.37017: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853696.37201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853696.38880: stdout chunk (state=3): >>>/root <<< 30583 1726853696.39147: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853696.39150: stdout chunk (state=3): >>><<< 30583 1726853696.39153: stderr chunk (state=3): >>><<< 30583 1726853696.39159: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853696.39162: _low_level_execute_command(): starting 30583 1726853696.39164: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853696.3905406-32053-10427290341711 `" && echo ansible-tmp-1726853696.3905406-32053-10427290341711="` echo /root/.ansible/tmp/ansible-tmp-1726853696.3905406-32053-10427290341711 `" ) && sleep 0' 30583 1726853696.40220: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853696.40224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853696.40227: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853696.40238: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853696.40389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853696.40401: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853696.40498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853696.42527: stdout chunk (state=3): >>>ansible-tmp-1726853696.3905406-32053-10427290341711=/root/.ansible/tmp/ansible-tmp-1726853696.3905406-32053-10427290341711 <<< 30583 1726853696.42641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853696.42674: stderr chunk (state=3): >>><<< 30583 1726853696.42682: stdout chunk (state=3): >>><<< 30583 1726853696.42721: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853696.3905406-32053-10427290341711=/root/.ansible/tmp/ansible-tmp-1726853696.3905406-32053-10427290341711 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853696.42925: variable 'ansible_module_compression' from source: unknown 30583 1726853696.42928: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30583 1726853696.43043: variable 'ansible_facts' from source: unknown 30583 1726853696.43136: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853696.3905406-32053-10427290341711/AnsiballZ_command.py 30583 1726853696.43385: Sending initial data 30583 1726853696.43389: Sent initial data (155 bytes) 30583 1726853696.44188: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853696.44394: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853696.44490: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853696.44537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853696.44601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853696.46296: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30583 1726853696.46311: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853696.46413: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853696.46496: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpfj2efrh8 /root/.ansible/tmp/ansible-tmp-1726853696.3905406-32053-10427290341711/AnsiballZ_command.py <<< 30583 1726853696.46499: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853696.3905406-32053-10427290341711/AnsiballZ_command.py" <<< 30583 1726853696.46977: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpfj2efrh8" to remote "/root/.ansible/tmp/ansible-tmp-1726853696.3905406-32053-10427290341711/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853696.3905406-32053-10427290341711/AnsiballZ_command.py" <<< 30583 1726853696.47793: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853696.47797: stdout chunk (state=3): >>><<< 30583 1726853696.47804: stderr chunk (state=3): >>><<< 30583 1726853696.47875: done transferring module to remote 30583 1726853696.47888: _low_level_execute_command(): starting 30583 1726853696.47891: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853696.3905406-32053-10427290341711/ /root/.ansible/tmp/ansible-tmp-1726853696.3905406-32053-10427290341711/AnsiballZ_command.py && sleep 0' 30583 1726853696.48587: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853696.48629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853696.48648: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853696.48678: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853696.48785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853696.50986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853696.50990: stdout chunk (state=3): >>><<< 30583 1726853696.51150: stderr chunk (state=3): >>><<< 30583 1726853696.51154: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853696.51160: _low_level_execute_command(): starting 30583 1726853696.51162: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853696.3905406-32053-10427290341711/AnsiballZ_command.py && sleep 0' 30583 1726853696.52352: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853696.52464: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853696.52637: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853696.52791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853696.68746: stdout chunk (state=3): >>> {"changed": true, "stdout": 
"bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:34:56.682953", "end": "2024-09-20 13:34:56.686413", "delta": "0:00:00.003460", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30583 1726853696.70391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853696.70407: stdout chunk (state=3): >>><<< 30583 1726853696.70419: stderr chunk (state=3): >>><<< 30583 1726853696.70445: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:34:56.682953", "end": "2024-09-20 13:34:56.686413", "delta": "0:00:00.003460", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853696.70491: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853696.3905406-32053-10427290341711/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853696.70508: _low_level_execute_command(): starting 30583 1726853696.70518: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853696.3905406-32053-10427290341711/ > /dev/null 2>&1 && sleep 0' 30583 1726853696.71167: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853696.71191: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853696.71208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853696.71231: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853696.71250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853696.71291: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853696.71364: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853696.71388: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853696.71501: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853696.71564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853696.73442: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853696.73477: stderr chunk (state=3): >>><<< 30583 1726853696.73480: stdout chunk (state=3): >>><<< 30583 1726853696.73489: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853696.73495: handler run complete 30583 1726853696.73514: Evaluated conditional (False): False 30583 1726853696.73527: attempt loop complete, returning result 30583 1726853696.73530: _execute() done 30583 1726853696.73532: dumping result to json 30583 1726853696.73534: done dumping result, returning 30583 1726853696.73540: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [02083763-bbaf-05ea-abc5-000000000aad] 30583 1726853696.73545: sending task result for task 02083763-bbaf-05ea-abc5-000000000aad 30583 1726853696.73648: done sending task result for task 02083763-bbaf-05ea-abc5-000000000aad 30583 1726853696.73651: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003460", "end": "2024-09-20 13:34:56.686413", "rc": 0, "start": "2024-09-20 13:34:56.682953" } STDOUT: bonding_masters eth0 lo 30583 1726853696.73758: no more pending results, returning what we have 30583 1726853696.73766: results queue empty 30583 1726853696.73768: checking for any_errors_fatal 30583 1726853696.73769: done checking for any_errors_fatal 30583 1726853696.73770: checking for max_fail_percentage 30583 1726853696.73774: done checking for max_fail_percentage 30583 
1726853696.73775: checking to see if all hosts have failed and the running result is not ok 30583 1726853696.73775: done checking to see if all hosts have failed 30583 1726853696.73776: getting the remaining hosts for this loop 30583 1726853696.73778: done getting the remaining hosts for this loop 30583 1726853696.73782: getting the next task for host managed_node2 30583 1726853696.73789: done getting next task for host managed_node2 30583 1726853696.73792: ^ task is: TASK: Set current_interfaces 30583 1726853696.73796: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853696.73801: getting variables 30583 1726853696.73802: in VariableManager get_vars() 30583 1726853696.73833: Calling all_inventory to load vars for managed_node2 30583 1726853696.73835: Calling groups_inventory to load vars for managed_node2 30583 1726853696.73838: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853696.73849: Calling all_plugins_play to load vars for managed_node2 30583 1726853696.73851: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853696.73854: Calling groups_plugins_play to load vars for managed_node2 30583 1726853696.75931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853696.83657: done with get_vars() 30583 1726853696.83689: done getting variables 30583 1726853696.83745: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:34:56 -0400 (0:00:00.505) 0:00:32.174 ****** 30583 1726853696.83777: entering _queue_task() for managed_node2/set_fact 30583 1726853696.84211: worker is 1 (out of 1 available) 30583 1726853696.84224: exiting _queue_task() for managed_node2/set_fact 30583 1726853696.84380: done queuing things up, now waiting for results queue to drain 30583 1726853696.84382: waiting for pending results... 
30583 1726853696.84600: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 30583 1726853696.84712: in run() - task 02083763-bbaf-05ea-abc5-000000000aae 30583 1726853696.84723: variable 'ansible_search_path' from source: unknown 30583 1726853696.84732: variable 'ansible_search_path' from source: unknown 30583 1726853696.84780: calling self._execute() 30583 1726853696.84899: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853696.84915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853696.84942: variable 'omit' from source: magic vars 30583 1726853696.85475: variable 'ansible_distribution_major_version' from source: facts 30583 1726853696.85479: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853696.85482: variable 'omit' from source: magic vars 30583 1726853696.85484: variable 'omit' from source: magic vars 30583 1726853696.85750: variable '_current_interfaces' from source: set_fact 30583 1726853696.85840: variable 'omit' from source: magic vars 30583 1726853696.85949: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853696.85953: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853696.85985: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853696.86007: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853696.86054: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853696.86083: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853696.86092: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853696.86100: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853696.86275: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853696.86284: Set connection var ansible_timeout to 10 30583 1726853696.86290: Set connection var ansible_connection to ssh 30583 1726853696.86292: Set connection var ansible_shell_executable to /bin/sh 30583 1726853696.86295: Set connection var ansible_shell_type to sh 30583 1726853696.86297: Set connection var ansible_pipelining to False 30583 1726853696.86305: variable 'ansible_shell_executable' from source: unknown 30583 1726853696.86312: variable 'ansible_connection' from source: unknown 30583 1726853696.86318: variable 'ansible_module_compression' from source: unknown 30583 1726853696.86325: variable 'ansible_shell_type' from source: unknown 30583 1726853696.86332: variable 'ansible_shell_executable' from source: unknown 30583 1726853696.86338: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853696.86345: variable 'ansible_pipelining' from source: unknown 30583 1726853696.86351: variable 'ansible_timeout' from source: unknown 30583 1726853696.86358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853696.86596: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853696.86599: variable 'omit' from source: magic vars 30583 1726853696.86601: starting attempt loop 30583 1726853696.86604: running the handler 30583 1726853696.86608: handler run complete 30583 1726853696.86610: attempt loop complete, returning result 30583 1726853696.86620: _execute() done 30583 1726853696.86622: dumping result to json 30583 1726853696.86624: done dumping result, returning 30583 
1726853696.86637: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [02083763-bbaf-05ea-abc5-000000000aae] 30583 1726853696.86646: sending task result for task 02083763-bbaf-05ea-abc5-000000000aae ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 30583 1726853696.87076: no more pending results, returning what we have 30583 1726853696.87079: results queue empty 30583 1726853696.87080: checking for any_errors_fatal 30583 1726853696.87087: done checking for any_errors_fatal 30583 1726853696.87088: checking for max_fail_percentage 30583 1726853696.87090: done checking for max_fail_percentage 30583 1726853696.87091: checking to see if all hosts have failed and the running result is not ok 30583 1726853696.87091: done checking to see if all hosts have failed 30583 1726853696.87092: getting the remaining hosts for this loop 30583 1726853696.87093: done getting the remaining hosts for this loop 30583 1726853696.87099: getting the next task for host managed_node2 30583 1726853696.87110: done getting next task for host managed_node2 30583 1726853696.87114: ^ task is: TASK: Show current_interfaces 30583 1726853696.87117: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853696.87121: getting variables 30583 1726853696.87122: in VariableManager get_vars() 30583 1726853696.87155: Calling all_inventory to load vars for managed_node2 30583 1726853696.87158: Calling groups_inventory to load vars for managed_node2 30583 1726853696.87165: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853696.87184: Calling all_plugins_play to load vars for managed_node2 30583 1726853696.87188: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853696.87191: Calling groups_plugins_play to load vars for managed_node2 30583 1726853696.87787: done sending task result for task 02083763-bbaf-05ea-abc5-000000000aae 30583 1726853696.87790: WORKER PROCESS EXITING 30583 1726853696.88691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853696.90486: done with get_vars() 30583 1726853696.90507: done getting variables 30583 1726853696.90817: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 13:34:56 -0400 (0:00:00.070) 0:00:32.245 ****** 30583 1726853696.90850: entering _queue_task() for managed_node2/debug 30583 1726853696.91477: worker is 1 (out of 1 available) 30583 1726853696.91489: exiting _queue_task() for managed_node2/debug 30583 1726853696.91501: done queuing things up, now waiting for results queue to drain 30583 1726853696.91502: waiting for pending 
results... 30583 1726853696.91655: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 30583 1726853696.92018: in run() - task 02083763-bbaf-05ea-abc5-000000000a73 30583 1726853696.92031: variable 'ansible_search_path' from source: unknown 30583 1726853696.92034: variable 'ansible_search_path' from source: unknown 30583 1726853696.92076: calling self._execute() 30583 1726853696.92386: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853696.92411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853696.92415: variable 'omit' from source: magic vars 30583 1726853696.93215: variable 'ansible_distribution_major_version' from source: facts 30583 1726853696.93278: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853696.93282: variable 'omit' from source: magic vars 30583 1726853696.93333: variable 'omit' from source: magic vars 30583 1726853696.93465: variable 'current_interfaces' from source: set_fact 30583 1726853696.93496: variable 'omit' from source: magic vars 30583 1726853696.93536: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853696.93574: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853696.93595: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853696.93624: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853696.93627: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853696.93672: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853696.93676: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853696.93679: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853696.93821: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853696.93825: Set connection var ansible_timeout to 10 30583 1726853696.93827: Set connection var ansible_connection to ssh 30583 1726853696.93829: Set connection var ansible_shell_executable to /bin/sh 30583 1726853696.93929: Set connection var ansible_shell_type to sh 30583 1726853696.93934: Set connection var ansible_pipelining to False 30583 1726853696.93937: variable 'ansible_shell_executable' from source: unknown 30583 1726853696.93945: variable 'ansible_connection' from source: unknown 30583 1726853696.93948: variable 'ansible_module_compression' from source: unknown 30583 1726853696.93951: variable 'ansible_shell_type' from source: unknown 30583 1726853696.93954: variable 'ansible_shell_executable' from source: unknown 30583 1726853696.93956: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853696.93958: variable 'ansible_pipelining' from source: unknown 30583 1726853696.93960: variable 'ansible_timeout' from source: unknown 30583 1726853696.93962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853696.94041: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853696.94045: variable 'omit' from source: magic vars 30583 1726853696.94047: starting attempt loop 30583 1726853696.94054: running the handler 30583 1726853696.94074: handler run complete 30583 1726853696.94085: attempt loop complete, returning result 30583 1726853696.94088: _execute() done 30583 1726853696.94091: dumping result to json 30583 1726853696.94094: done dumping result, returning 30583 1726853696.94149: done 
running TaskExecutor() for managed_node2/TASK: Show current_interfaces [02083763-bbaf-05ea-abc5-000000000a73] 30583 1726853696.94152: sending task result for task 02083763-bbaf-05ea-abc5-000000000a73 ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 30583 1726853696.94339: no more pending results, returning what we have 30583 1726853696.94342: results queue empty 30583 1726853696.94343: checking for any_errors_fatal 30583 1726853696.94348: done checking for any_errors_fatal 30583 1726853696.94348: checking for max_fail_percentage 30583 1726853696.94350: done checking for max_fail_percentage 30583 1726853696.94351: checking to see if all hosts have failed and the running result is not ok 30583 1726853696.94351: done checking to see if all hosts have failed 30583 1726853696.94352: getting the remaining hosts for this loop 30583 1726853696.94353: done getting the remaining hosts for this loop 30583 1726853696.94356: getting the next task for host managed_node2 30583 1726853696.94369: done getting next task for host managed_node2 30583 1726853696.94376: ^ task is: TASK: Setup 30583 1726853696.94378: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853696.94382: getting variables 30583 1726853696.94383: in VariableManager get_vars() 30583 1726853696.94410: Calling all_inventory to load vars for managed_node2 30583 1726853696.94412: Calling groups_inventory to load vars for managed_node2 30583 1726853696.94415: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853696.94424: Calling all_plugins_play to load vars for managed_node2 30583 1726853696.94426: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853696.94429: Calling groups_plugins_play to load vars for managed_node2 30583 1726853696.94985: done sending task result for task 02083763-bbaf-05ea-abc5-000000000a73 30583 1726853696.94988: WORKER PROCESS EXITING 30583 1726853696.96008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853696.97639: done with get_vars() 30583 1726853696.97665: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 13:34:56 -0400 (0:00:00.069) 0:00:32.314 ****** 30583 1726853696.97773: entering _queue_task() for managed_node2/include_tasks 30583 1726853696.98130: worker is 1 (out of 1 available) 30583 1726853696.98143: exiting _queue_task() for managed_node2/include_tasks 30583 1726853696.98156: done queuing things up, now waiting for results queue to drain 30583 1726853696.98157: waiting for pending results... 
30583 1726853696.98458: running TaskExecutor() for managed_node2/TASK: Setup 30583 1726853696.98582: in run() - task 02083763-bbaf-05ea-abc5-000000000a4c 30583 1726853696.98610: variable 'ansible_search_path' from source: unknown 30583 1726853696.98617: variable 'ansible_search_path' from source: unknown 30583 1726853696.98666: variable 'lsr_setup' from source: include params 30583 1726853696.98896: variable 'lsr_setup' from source: include params 30583 1726853696.98979: variable 'omit' from source: magic vars 30583 1726853696.99123: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853696.99180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853696.99184: variable 'omit' from source: magic vars 30583 1726853696.99434: variable 'ansible_distribution_major_version' from source: facts 30583 1726853696.99449: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853696.99460: variable 'item' from source: unknown 30583 1726853696.99580: variable 'item' from source: unknown 30583 1726853696.99584: variable 'item' from source: unknown 30583 1726853696.99636: variable 'item' from source: unknown 30583 1726853696.99986: dumping result to json 30583 1726853696.99990: done dumping result, returning 30583 1726853696.99992: done running TaskExecutor() for managed_node2/TASK: Setup [02083763-bbaf-05ea-abc5-000000000a4c] 30583 1726853696.99994: sending task result for task 02083763-bbaf-05ea-abc5-000000000a4c 30583 1726853697.00033: done sending task result for task 02083763-bbaf-05ea-abc5-000000000a4c 30583 1726853697.00036: WORKER PROCESS EXITING 30583 1726853697.00108: no more pending results, returning what we have 30583 1726853697.00112: in VariableManager get_vars() 30583 1726853697.00141: Calling all_inventory to load vars for managed_node2 30583 1726853697.00144: Calling groups_inventory to load vars for managed_node2 30583 1726853697.00147: Calling all_plugins_inventory to 
load vars for managed_node2 30583 1726853697.00158: Calling all_plugins_play to load vars for managed_node2 30583 1726853697.00161: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853697.00164: Calling groups_plugins_play to load vars for managed_node2 30583 1726853697.01553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853697.03190: done with get_vars() 30583 1726853697.03210: variable 'ansible_search_path' from source: unknown 30583 1726853697.03212: variable 'ansible_search_path' from source: unknown 30583 1726853697.03257: we have included files to process 30583 1726853697.03258: generating all_blocks data 30583 1726853697.03260: done generating all_blocks data 30583 1726853697.03265: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30583 1726853697.03266: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30583 1726853697.03269: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30583 1726853697.03517: done processing included file 30583 1726853697.03520: iterating over new_blocks loaded from include file 30583 1726853697.03521: in VariableManager get_vars() 30583 1726853697.03536: done with get_vars() 30583 1726853697.03537: filtering new block on tags 30583 1726853697.03581: done filtering new block on tags 30583 1726853697.03584: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node2 => (item=tasks/create_bridge_profile.yml) 30583 1726853697.03589: extending task lists for all hosts with included blocks 30583 1726853697.04238: done 
extending task lists 30583 1726853697.04240: done processing included files 30583 1726853697.04241: results queue empty 30583 1726853697.04241: checking for any_errors_fatal 30583 1726853697.04245: done checking for any_errors_fatal 30583 1726853697.04246: checking for max_fail_percentage 30583 1726853697.04247: done checking for max_fail_percentage 30583 1726853697.04248: checking to see if all hosts have failed and the running result is not ok 30583 1726853697.04249: done checking to see if all hosts have failed 30583 1726853697.04250: getting the remaining hosts for this loop 30583 1726853697.04251: done getting the remaining hosts for this loop 30583 1726853697.04253: getting the next task for host managed_node2 30583 1726853697.04257: done getting next task for host managed_node2 30583 1726853697.04260: ^ task is: TASK: Include network role 30583 1726853697.04262: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853697.04265: getting variables 30583 1726853697.04266: in VariableManager get_vars() 30583 1726853697.04277: Calling all_inventory to load vars for managed_node2 30583 1726853697.04279: Calling groups_inventory to load vars for managed_node2 30583 1726853697.04281: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853697.04287: Calling all_plugins_play to load vars for managed_node2 30583 1726853697.04290: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853697.04293: Calling groups_plugins_play to load vars for managed_node2 30583 1726853697.05551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853697.07114: done with get_vars() 30583 1726853697.07142: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Friday 20 September 2024 13:34:57 -0400 (0:00:00.094) 0:00:32.409 ****** 30583 1726853697.07221: entering _queue_task() for managed_node2/include_role 30583 1726853697.07790: worker is 1 (out of 1 available) 30583 1726853697.07800: exiting _queue_task() for managed_node2/include_role 30583 1726853697.07810: done queuing things up, now waiting for results queue to drain 30583 1726853697.07811: waiting for pending results... 
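Every task in this run logs `Evaluated conditional (ansible_distribution_major_version != '6'): True` before executing. A minimal sketch of that check, with an illustrative fact value (the real value comes from gathered facts on `managed_node2` and is not shown in this log):

```python
# The distribution major version is a *string* fact in Ansible, so the
# role's `when:` clause compares against the string '6', not the int 6.
ansible_distribution_major_version = "40"  # illustrative stand-in value

# Equivalent of the `when: ansible_distribution_major_version != '6'`
# conditional that the log reports as True for every task here.
run_task = ansible_distribution_major_version != "6"
```

Because the comparison is string-based, a host reporting major version `"6"` (e.g. EL6) would skip these tasks, while anything else, including `"60"`, would run them.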
30583 1726853697.07945: running TaskExecutor() for managed_node2/TASK: Include network role 30583 1726853697.08152: in run() - task 02083763-bbaf-05ea-abc5-000000000ad1 30583 1726853697.08156: variable 'ansible_search_path' from source: unknown 30583 1726853697.08159: variable 'ansible_search_path' from source: unknown 30583 1726853697.08162: calling self._execute() 30583 1726853697.08219: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853697.08229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853697.08245: variable 'omit' from source: magic vars 30583 1726853697.08642: variable 'ansible_distribution_major_version' from source: facts 30583 1726853697.08659: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853697.08669: _execute() done 30583 1726853697.08685: dumping result to json 30583 1726853697.08700: done dumping result, returning 30583 1726853697.08712: done running TaskExecutor() for managed_node2/TASK: Include network role [02083763-bbaf-05ea-abc5-000000000ad1] 30583 1726853697.08721: sending task result for task 02083763-bbaf-05ea-abc5-000000000ad1 30583 1726853697.08957: no more pending results, returning what we have 30583 1726853697.08963: in VariableManager get_vars() 30583 1726853697.09002: Calling all_inventory to load vars for managed_node2 30583 1726853697.09005: Calling groups_inventory to load vars for managed_node2 30583 1726853697.09019: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853697.09035: Calling all_plugins_play to load vars for managed_node2 30583 1726853697.09039: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853697.09042: Calling groups_plugins_play to load vars for managed_node2 30583 1726853697.09631: done sending task result for task 02083763-bbaf-05ea-abc5-000000000ad1 30583 1726853697.09634: WORKER PROCESS EXITING 30583 1726853697.10638: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853697.12420: done with get_vars() 30583 1726853697.12440: variable 'ansible_search_path' from source: unknown 30583 1726853697.12442: variable 'ansible_search_path' from source: unknown 30583 1726853697.12656: variable 'omit' from source: magic vars 30583 1726853697.12699: variable 'omit' from source: magic vars 30583 1726853697.12715: variable 'omit' from source: magic vars 30583 1726853697.12719: we have included files to process 30583 1726853697.12720: generating all_blocks data 30583 1726853697.12722: done generating all_blocks data 30583 1726853697.12731: processing included file: fedora.linux_system_roles.network 30583 1726853697.12752: in VariableManager get_vars() 30583 1726853697.12765: done with get_vars() 30583 1726853697.12795: in VariableManager get_vars() 30583 1726853697.12813: done with get_vars() 30583 1726853697.12858: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30583 1726853697.12987: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30583 1726853697.13076: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30583 1726853697.13538: in VariableManager get_vars() 30583 1726853697.13558: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30583 1726853697.15606: iterating over new_blocks loaded from include file 30583 1726853697.15608: in VariableManager get_vars() 30583 1726853697.15631: done with get_vars() 30583 1726853697.15633: filtering new block on tags 30583 1726853697.15985: done filtering new block on tags 30583 1726853697.15989: in VariableManager get_vars() 30583 1726853697.16008: done with get_vars() 30583 1726853697.16010: filtering new block on tags 30583 1726853697.16030: done 
filtering new block on tags 30583 1726853697.16033: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30583 1726853697.16042: extending task lists for all hosts with included blocks 30583 1726853697.16238: done extending task lists 30583 1726853697.16240: done processing included files 30583 1726853697.16240: results queue empty 30583 1726853697.16241: checking for any_errors_fatal 30583 1726853697.16244: done checking for any_errors_fatal 30583 1726853697.16245: checking for max_fail_percentage 30583 1726853697.16246: done checking for max_fail_percentage 30583 1726853697.16247: checking to see if all hosts have failed and the running result is not ok 30583 1726853697.16247: done checking to see if all hosts have failed 30583 1726853697.16248: getting the remaining hosts for this loop 30583 1726853697.16250: done getting the remaining hosts for this loop 30583 1726853697.16253: getting the next task for host managed_node2 30583 1726853697.16257: done getting next task for host managed_node2 30583 1726853697.16259: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30583 1726853697.16262: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853697.16273: getting variables 30583 1726853697.16274: in VariableManager get_vars() 30583 1726853697.16286: Calling all_inventory to load vars for managed_node2 30583 1726853697.16293: Calling groups_inventory to load vars for managed_node2 30583 1726853697.16295: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853697.16301: Calling all_plugins_play to load vars for managed_node2 30583 1726853697.16303: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853697.16306: Calling groups_plugins_play to load vars for managed_node2 30583 1726853697.17548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853697.19494: done with get_vars() 30583 1726853697.19519: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:34:57 -0400 (0:00:00.123) 0:00:32.533 ****** 30583 1726853697.19602: entering _queue_task() for managed_node2/include_tasks 30583 1726853697.20184: worker is 1 (out of 1 available) 30583 1726853697.20194: exiting _queue_task() for managed_node2/include_tasks 30583 1726853697.20205: done queuing things up, now waiting for results queue to drain 30583 1726853697.20206: waiting for pending results... 
30583 1726853697.20323: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30583 1726853697.20488: in run() - task 02083763-bbaf-05ea-abc5-000000000b33 30583 1726853697.20509: variable 'ansible_search_path' from source: unknown 30583 1726853697.20518: variable 'ansible_search_path' from source: unknown 30583 1726853697.20569: calling self._execute() 30583 1726853697.20764: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853697.20768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853697.20774: variable 'omit' from source: magic vars 30583 1726853697.21195: variable 'ansible_distribution_major_version' from source: facts 30583 1726853697.21220: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853697.21232: _execute() done 30583 1726853697.21252: dumping result to json 30583 1726853697.21263: done dumping result, returning 30583 1726853697.21303: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-05ea-abc5-000000000b33] 30583 1726853697.21308: sending task result for task 02083763-bbaf-05ea-abc5-000000000b33 30583 1726853697.21609: done sending task result for task 02083763-bbaf-05ea-abc5-000000000b33 30583 1726853697.21613: WORKER PROCESS EXITING 30583 1726853697.21662: no more pending results, returning what we have 30583 1726853697.21668: in VariableManager get_vars() 30583 1726853697.21714: Calling all_inventory to load vars for managed_node2 30583 1726853697.21717: Calling groups_inventory to load vars for managed_node2 30583 1726853697.21720: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853697.21736: Calling all_plugins_play to load vars for managed_node2 30583 1726853697.21740: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853697.21743: Calling 
groups_plugins_play to load vars for managed_node2 30583 1726853697.23258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853697.24913: done with get_vars() 30583 1726853697.24934: variable 'ansible_search_path' from source: unknown 30583 1726853697.24935: variable 'ansible_search_path' from source: unknown 30583 1726853697.24977: we have included files to process 30583 1726853697.24985: generating all_blocks data 30583 1726853697.24987: done generating all_blocks data 30583 1726853697.24990: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853697.24992: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853697.24994: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853697.25597: done processing included file 30583 1726853697.25599: iterating over new_blocks loaded from include file 30583 1726853697.25601: in VariableManager get_vars() 30583 1726853697.25624: done with get_vars() 30583 1726853697.25626: filtering new block on tags 30583 1726853697.25662: done filtering new block on tags 30583 1726853697.25665: in VariableManager get_vars() 30583 1726853697.25687: done with get_vars() 30583 1726853697.25689: filtering new block on tags 30583 1726853697.25730: done filtering new block on tags 30583 1726853697.25733: in VariableManager get_vars() 30583 1726853697.25759: done with get_vars() 30583 1726853697.25761: filtering new block on tags 30583 1726853697.25807: done filtering new block on tags 30583 1726853697.25810: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30583 1726853697.25815: extending task lists for 
all hosts with included blocks 30583 1726853697.27561: done extending task lists 30583 1726853697.27562: done processing included files 30583 1726853697.27563: results queue empty 30583 1726853697.27564: checking for any_errors_fatal 30583 1726853697.27567: done checking for any_errors_fatal 30583 1726853697.27568: checking for max_fail_percentage 30583 1726853697.27569: done checking for max_fail_percentage 30583 1726853697.27570: checking to see if all hosts have failed and the running result is not ok 30583 1726853697.27572: done checking to see if all hosts have failed 30583 1726853697.27573: getting the remaining hosts for this loop 30583 1726853697.27574: done getting the remaining hosts for this loop 30583 1726853697.27577: getting the next task for host managed_node2 30583 1726853697.27582: done getting next task for host managed_node2 30583 1726853697.27584: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30583 1726853697.27592: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853697.27607: getting variables 30583 1726853697.27608: in VariableManager get_vars() 30583 1726853697.27621: Calling all_inventory to load vars for managed_node2 30583 1726853697.27623: Calling groups_inventory to load vars for managed_node2 30583 1726853697.27625: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853697.27630: Calling all_plugins_play to load vars for managed_node2 30583 1726853697.27632: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853697.27635: Calling groups_plugins_play to load vars for managed_node2 30583 1726853697.28915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853697.30549: done with get_vars() 30583 1726853697.30574: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:34:57 -0400 (0:00:00.110) 0:00:32.643 ****** 30583 1726853697.30667: entering _queue_task() for managed_node2/setup 30583 1726853697.31047: worker is 1 (out of 1 available) 30583 1726853697.31186: exiting _queue_task() for managed_node2/setup 30583 1726853697.31196: done queuing things up, now waiting for results queue to drain 30583 1726853697.31197: waiting for pending results... 
30583 1726853697.31411: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30583 1726853697.31615: in run() - task 02083763-bbaf-05ea-abc5-000000000b90 30583 1726853697.31620: variable 'ansible_search_path' from source: unknown 30583 1726853697.31623: variable 'ansible_search_path' from source: unknown 30583 1726853697.31626: calling self._execute() 30583 1726853697.31726: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853697.31738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853697.31756: variable 'omit' from source: magic vars 30583 1726853697.32163: variable 'ansible_distribution_major_version' from source: facts 30583 1726853697.32184: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853697.32416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853697.34876: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853697.34880: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853697.34883: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853697.34885: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853697.34887: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853697.34962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853697.35003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853697.35035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853697.35081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853697.35102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853697.35163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853697.35193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853697.35227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853697.35269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853697.35290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853697.35460: variable '__network_required_facts' from source: role 
'' defaults 30583 1726853697.35476: variable 'ansible_facts' from source: unknown 30583 1726853697.36249: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30583 1726853697.36258: when evaluation is False, skipping this task 30583 1726853697.36265: _execute() done 30583 1726853697.36273: dumping result to json 30583 1726853697.36281: done dumping result, returning 30583 1726853697.36292: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-05ea-abc5-000000000b90] 30583 1726853697.36410: sending task result for task 02083763-bbaf-05ea-abc5-000000000b90 30583 1726853697.36481: done sending task result for task 02083763-bbaf-05ea-abc5-000000000b90 30583 1726853697.36484: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853697.36557: no more pending results, returning what we have 30583 1726853697.36561: results queue empty 30583 1726853697.36562: checking for any_errors_fatal 30583 1726853697.36564: done checking for any_errors_fatal 30583 1726853697.36565: checking for max_fail_percentage 30583 1726853697.36567: done checking for max_fail_percentage 30583 1726853697.36568: checking to see if all hosts have failed and the running result is not ok 30583 1726853697.36568: done checking to see if all hosts have failed 30583 1726853697.36569: getting the remaining hosts for this loop 30583 1726853697.36573: done getting the remaining hosts for this loop 30583 1726853697.36577: getting the next task for host managed_node2 30583 1726853697.36590: done getting next task for host managed_node2 30583 1726853697.36594: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30583 1726853697.36602: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853697.36626: getting variables 30583 1726853697.36628: in VariableManager get_vars() 30583 1726853697.36670: Calling all_inventory to load vars for managed_node2 30583 1726853697.36877: Calling groups_inventory to load vars for managed_node2 30583 1726853697.36880: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853697.36890: Calling all_plugins_play to load vars for managed_node2 30583 1726853697.36893: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853697.36901: Calling groups_plugins_play to load vars for managed_node2 30583 1726853697.38258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853697.40266: done with get_vars() 30583 1726853697.40291: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:34:57 -0400 (0:00:00.099) 0:00:32.743 ****** 30583 1726853697.40596: entering _queue_task() for managed_node2/stat 30583 1726853697.41131: worker is 1 (out of 1 available) 30583 1726853697.41144: exiting _queue_task() for managed_node2/stat 30583 1726853697.41156: done queuing things up, now waiting for results queue to drain 30583 1726853697.41157: waiting for pending results... 
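The skip above comes from the conditional `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` evaluating to False: every fact the role needs is already cached, so the setup task does not run. A minimal sketch of that filter logic, using illustrative names (the role's actual `__network_required_facts` list is defined in its defaults and not printed in this log):

```python
# Illustrative stand-ins; the real values live in the role defaults and
# in the cached facts for managed_node2.
required_facts = ["distribution", "distribution_major_version", "os_family"]
ansible_facts = {
    "distribution": "CentOS",
    "distribution_major_version": "10",
    "os_family": "RedHat",
}

# `a | difference(b)` keeps the items of a that are absent from b.
missing = [f for f in required_facts if f not in ansible_facts]

# The task's `when:` is `missing | length > 0`; with nothing missing it
# is False, so the fact-gathering setup task is skipped, as logged above.
run_setup = len(missing) > 0
```

When any required fact is missing from the cache, `missing` is non-empty and the role re-gathers facts before continuing.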
30583 1726853697.42395: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30583 1726853697.42629: in run() - task 02083763-bbaf-05ea-abc5-000000000b92 30583 1726853697.42642: variable 'ansible_search_path' from source: unknown 30583 1726853697.42646: variable 'ansible_search_path' from source: unknown 30583 1726853697.42800: calling self._execute() 30583 1726853697.43003: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853697.43018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853697.43025: variable 'omit' from source: magic vars 30583 1726853697.43966: variable 'ansible_distribution_major_version' from source: facts 30583 1726853697.44044: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853697.44398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853697.45027: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853697.45123: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853697.45155: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853697.45244: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853697.45678: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853697.45682: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853697.45684: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853697.45687: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853697.45936: variable '__network_is_ostree' from source: set_fact 30583 1726853697.45943: Evaluated conditional (not __network_is_ostree is defined): False 30583 1726853697.45946: when evaluation is False, skipping this task 30583 1726853697.45948: _execute() done 30583 1726853697.45951: dumping result to json 30583 1726853697.45953: done dumping result, returning 30583 1726853697.45967: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-05ea-abc5-000000000b92] 30583 1726853697.45970: sending task result for task 02083763-bbaf-05ea-abc5-000000000b92 30583 1726853697.46076: done sending task result for task 02083763-bbaf-05ea-abc5-000000000b92 30583 1726853697.46080: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30583 1726853697.46152: no more pending results, returning what we have 30583 1726853697.46157: results queue empty 30583 1726853697.46158: checking for any_errors_fatal 30583 1726853697.46166: done checking for any_errors_fatal 30583 1726853697.46167: checking for max_fail_percentage 30583 1726853697.46169: done checking for max_fail_percentage 30583 1726853697.46170: checking to see if all hosts have failed and the running result is not ok 30583 1726853697.46173: done checking to see if all hosts have failed 30583 1726853697.46173: getting the remaining hosts for this loop 30583 1726853697.46175: done getting the remaining hosts for this loop 30583 
1726853697.46180: getting the next task for host managed_node2 30583 1726853697.46189: done getting next task for host managed_node2 30583 1726853697.46194: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30583 1726853697.46200: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853697.46222: getting variables 30583 1726853697.46224: in VariableManager get_vars() 30583 1726853697.46261: Calling all_inventory to load vars for managed_node2 30583 1726853697.46263: Calling groups_inventory to load vars for managed_node2 30583 1726853697.46265: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853697.46578: Calling all_plugins_play to load vars for managed_node2 30583 1726853697.46582: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853697.46586: Calling groups_plugins_play to load vars for managed_node2 30583 1726853697.50405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853697.53399: done with get_vars() 30583 1726853697.53429: done getting variables 30583 1726853697.53698: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:34:57 -0400 (0:00:00.131) 0:00:32.874 ****** 30583 1726853697.53737: entering _queue_task() for managed_node2/set_fact 30583 1726853697.54091: worker is 1 (out of 1 available) 30583 1726853697.54104: exiting _queue_task() for managed_node2/set_fact 30583 1726853697.54117: done queuing things up, now waiting for results queue to drain 30583 1726853697.54119: waiting for pending results... 
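The two tasks traced above ("Check if system is ostree" and "Set flag to indicate system is ostree") are both guarded by the conditional `not __network_is_ostree is defined`, and both are skipped here because the fact was already set by an earlier pass through set_facts.yml. A minimal sketch of that skip logic, with hypothetical names (`host_facts`, `should_run` are illustrations, not Ansible internals; only the conditional string and the skip result are taken from the log):

```python
# __network_is_ostree was populated by set_fact on an earlier pass
# (assumption based on "variable '__network_is_ostree' from source: set_fact").
host_facts = {"__network_is_ostree": False}

def should_run(facts):
    # Mirrors the guard "not __network_is_ostree is defined":
    # run the bootstrap task only while the fact is still undefined.
    return "__network_is_ostree" not in facts

if not should_run(host_facts):
    # This is the per-host result the log prints for the skipped task.
    result = {
        "changed": False,
        "false_condition": "not __network_is_ostree is defined",
        "skip_reason": "Conditional result was False",
    }
    print(result["skip_reason"])  # Conditional result was False
```

The same guard appears twice in a row in the log because both the check task and the follow-up set_fact task share it.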
30583 1726853697.54442: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30583 1726853697.54703: in run() - task 02083763-bbaf-05ea-abc5-000000000b93 30583 1726853697.54707: variable 'ansible_search_path' from source: unknown 30583 1726853697.54710: variable 'ansible_search_path' from source: unknown 30583 1726853697.54713: calling self._execute() 30583 1726853697.54774: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853697.54794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853697.54820: variable 'omit' from source: magic vars 30583 1726853697.55378: variable 'ansible_distribution_major_version' from source: facts 30583 1726853697.55381: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853697.55385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853697.55636: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853697.55686: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853697.55727: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853697.55764: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853697.55853: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853697.55886: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853697.55921: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853697.55952: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853697.56062: variable '__network_is_ostree' from source: set_fact 30583 1726853697.56076: Evaluated conditional (not __network_is_ostree is defined): False 30583 1726853697.56084: when evaluation is False, skipping this task 30583 1726853697.56091: _execute() done 30583 1726853697.56098: dumping result to json 30583 1726853697.56106: done dumping result, returning 30583 1726853697.56119: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-05ea-abc5-000000000b93] 30583 1726853697.56128: sending task result for task 02083763-bbaf-05ea-abc5-000000000b93 30583 1726853697.56238: done sending task result for task 02083763-bbaf-05ea-abc5-000000000b93 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30583 1726853697.56290: no more pending results, returning what we have 30583 1726853697.56294: results queue empty 30583 1726853697.56296: checking for any_errors_fatal 30583 1726853697.56303: done checking for any_errors_fatal 30583 1726853697.56304: checking for max_fail_percentage 30583 1726853697.56306: done checking for max_fail_percentage 30583 1726853697.56307: checking to see if all hosts have failed and the running result is not ok 30583 1726853697.56308: done checking to see if all hosts have failed 30583 1726853697.56309: getting the remaining hosts for this loop 30583 1726853697.56311: done getting the remaining hosts for this loop 30583 1726853697.56315: getting the next task for 
host managed_node2 30583 1726853697.56326: done getting next task for host managed_node2 30583 1726853697.56331: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853697.56338: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853697.56361: getting variables 30583 1726853697.56363: in VariableManager get_vars() 30583 1726853697.56404: Calling all_inventory to load vars for managed_node2 30583 1726853697.56407: Calling groups_inventory to load vars for managed_node2 30583 1726853697.56409: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853697.56421: Calling all_plugins_play to load vars for managed_node2 30583 1726853697.56425: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853697.56429: Calling groups_plugins_play to load vars for managed_node2 30583 1726853697.57778: WORKER PROCESS EXITING 30583 1726853697.59564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853697.61364: done with get_vars() 30583 1726853697.61387: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:34:57 -0400 (0:00:00.077) 0:00:32.952 ****** 30583 1726853697.61484: entering _queue_task() for managed_node2/service_facts 30583 1726853697.62318: worker is 1 (out of 1 available) 30583 1726853697.62332: exiting _queue_task() for managed_node2/service_facts 30583 1726853697.62346: done queuing things up, now waiting for results queue to drain 30583 1726853697.62347: waiting for pending results... 
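The `service_facts` task queued here returns a per-service dict under `ansible_facts.services`, as visible in the module output later in this log. A small sketch of consuming that structure, using a hand-trimmed three-entry sample copied from the log (the full payload contains many more services; filtering on `state == "running"` is an illustration of how a role might inspect it, not the role's actual code):

```python
import json

# Trimmed sample of the service_facts payload seen later in this log.
payload = json.loads('''
{"ansible_facts": {"services": {
  "auditd.service":  {"name": "auditd.service",  "state": "running",
                      "status": "enabled",   "source": "systemd"},
  "autofs.service":  {"name": "autofs.service",  "state": "stopped",
                      "status": "not-found", "source": "systemd"},
  "chronyd.service": {"name": "chronyd.service", "state": "running",
                      "status": "enabled",   "source": "systemd"}
}}}
''')

# Collect the names of services the module reports as running.
running = sorted(
    name
    for name, svc in payload["ansible_facts"]["services"].items()
    if svc["state"] == "running"
)
print(running)  # ['auditd.service', 'chronyd.service']
```

Each entry carries `name`, `state`, `status`, and `source`; `status: "not-found"` entries (like `autofs.service` above) describe units systemd knows of by reference but has no unit file for.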
30583 1726853697.62601: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853697.62877: in run() - task 02083763-bbaf-05ea-abc5-000000000b95 30583 1726853697.63007: variable 'ansible_search_path' from source: unknown 30583 1726853697.63015: variable 'ansible_search_path' from source: unknown 30583 1726853697.63058: calling self._execute() 30583 1726853697.63457: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853697.63502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853697.63553: variable 'omit' from source: magic vars 30583 1726853697.64184: variable 'ansible_distribution_major_version' from source: facts 30583 1726853697.64476: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853697.64479: variable 'omit' from source: magic vars 30583 1726853697.64490: variable 'omit' from source: magic vars 30583 1726853697.64636: variable 'omit' from source: magic vars 30583 1726853697.64685: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853697.64729: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853697.64839: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853697.64865: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853697.64886: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853697.65030: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853697.65045: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853697.65054: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853697.65164: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853697.65179: Set connection var ansible_timeout to 10 30583 1726853697.65186: Set connection var ansible_connection to ssh 30583 1726853697.65196: Set connection var ansible_shell_executable to /bin/sh 30583 1726853697.65205: Set connection var ansible_shell_type to sh 30583 1726853697.65223: Set connection var ansible_pipelining to False 30583 1726853697.65272: variable 'ansible_shell_executable' from source: unknown 30583 1726853697.65283: variable 'ansible_connection' from source: unknown 30583 1726853697.65292: variable 'ansible_module_compression' from source: unknown 30583 1726853697.65299: variable 'ansible_shell_type' from source: unknown 30583 1726853697.65306: variable 'ansible_shell_executable' from source: unknown 30583 1726853697.65313: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853697.65320: variable 'ansible_pipelining' from source: unknown 30583 1726853697.65327: variable 'ansible_timeout' from source: unknown 30583 1726853697.65334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853697.65569: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853697.65593: variable 'omit' from source: magic vars 30583 1726853697.65604: starting attempt loop 30583 1726853697.65611: running the handler 30583 1726853697.65627: _low_level_execute_command(): starting 30583 1726853697.65638: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853697.66398: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853697.66474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853697.66493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853697.66790: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853697.66977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853697.68636: stdout chunk (state=3): >>>/root <<< 30583 1726853697.68776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853697.68788: stdout chunk (state=3): >>><<< 30583 1726853697.68801: stderr chunk (state=3): >>><<< 30583 1726853697.68824: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853697.68950: _low_level_execute_command(): starting 30583 1726853697.68954: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853697.6888862-32109-86893395042595 `" && echo ansible-tmp-1726853697.6888862-32109-86893395042595="` echo /root/.ansible/tmp/ansible-tmp-1726853697.6888862-32109-86893395042595 `" ) && sleep 0' 30583 1726853697.70264: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853697.70434: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853697.70476: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853697.70530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853697.72595: stdout chunk (state=3): >>>ansible-tmp-1726853697.6888862-32109-86893395042595=/root/.ansible/tmp/ansible-tmp-1726853697.6888862-32109-86893395042595 <<< 30583 1726853697.72703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853697.72738: stderr chunk (state=3): >>><<< 30583 1726853697.72743: stdout chunk (state=3): >>><<< 30583 1726853697.72764: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853697.6888862-32109-86893395042595=/root/.ansible/tmp/ansible-tmp-1726853697.6888862-32109-86893395042595 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853697.72813: variable 'ansible_module_compression' from source: unknown 30583 1726853697.72858: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30583 1726853697.72900: variable 'ansible_facts' from source: unknown 30583 1726853697.72993: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853697.6888862-32109-86893395042595/AnsiballZ_service_facts.py 30583 1726853697.73297: Sending initial data 30583 1726853697.73300: Sent initial data (161 bytes) 30583 1726853697.73755: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853697.73790: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853697.73905: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853697.73919: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853697.74086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853697.74306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853697.76048: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853697.76196: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853697.76352: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpe1r2y5n7 /root/.ansible/tmp/ansible-tmp-1726853697.6888862-32109-86893395042595/AnsiballZ_service_facts.py <<< 30583 1726853697.76357: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853697.6888862-32109-86893395042595/AnsiballZ_service_facts.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpe1r2y5n7" to remote "/root/.ansible/tmp/ansible-tmp-1726853697.6888862-32109-86893395042595/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853697.6888862-32109-86893395042595/AnsiballZ_service_facts.py" <<< 30583 1726853697.77663: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853697.77735: stderr chunk (state=3): >>><<< 30583 1726853697.77738: stdout chunk (state=3): >>><<< 30583 1726853697.77792: done transferring module to remote 30583 1726853697.77865: _low_level_execute_command(): starting 30583 1726853697.77868: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853697.6888862-32109-86893395042595/ /root/.ansible/tmp/ansible-tmp-1726853697.6888862-32109-86893395042595/AnsiballZ_service_facts.py && sleep 0' 30583 1726853697.78995: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853697.79044: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853697.79062: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853697.79079: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853697.79246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853697.81180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853697.81230: stderr chunk (state=3): >>><<< 30583 1726853697.81235: stdout chunk (state=3): >>><<< 30583 1726853697.81257: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853697.81266: _low_level_execute_command(): starting 30583 1726853697.81269: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853697.6888862-32109-86893395042595/AnsiballZ_service_facts.py && sleep 0' 30583 1726853697.81853: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853697.81865: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853697.81878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853697.81893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853697.81905: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853697.81912: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853697.81921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853697.81936: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853697.81945: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853697.82009: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853697.82012: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853697.82015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853697.82017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853697.82020: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853697.82022: stderr chunk (state=3): >>>debug2: match found <<< 30583 1726853697.82024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853697.82086: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853697.82118: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853697.82121: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853697.82217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853699.42886: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": 
"gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", 
"state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": 
"rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": 
{"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": 
"systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30583 1726853699.44550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853699.44554: stdout chunk (state=3): >>><<< 30583 1726853699.44561: stderr chunk (state=3): >>><<< 30583 1726853699.44591: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": 
{"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": 
"plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": 
{"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": 
"systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", 
"status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": 
"systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853699.46380: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853697.6888862-32109-86893395042595/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853699.46384: _low_level_execute_command(): starting 30583 1726853699.46387: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853697.6888862-32109-86893395042595/ > /dev/null 2>&1 && sleep 0' 30583 1726853699.47749: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853699.47815: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853699.47922: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853699.47999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853699.48120: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853699.48145: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853699.48253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853699.50226: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853699.50230: stdout chunk (state=3): >>><<< 30583 1726853699.50236: stderr chunk (state=3): >>><<< 30583 1726853699.50253: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853699.50262: handler run complete 30583 1726853699.50712: variable 'ansible_facts' from source: unknown 30583 1726853699.51094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853699.52361: variable 'ansible_facts' from source: unknown 30583 1726853699.52679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853699.53070: attempt loop complete, returning result 30583 1726853699.53178: _execute() done 30583 1726853699.53182: dumping result to json 30583 1726853699.53379: done dumping result, returning 30583 1726853699.53383: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-05ea-abc5-000000000b95] 30583 1726853699.53385: sending task result for task 02083763-bbaf-05ea-abc5-000000000b95 30583 1726853699.54961: done sending task result for task 02083763-bbaf-05ea-abc5-000000000b95 30583 1726853699.54965: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853699.55093: no more pending results, returning what we have 30583 1726853699.55098: results queue empty 30583 1726853699.55099: checking for any_errors_fatal 30583 1726853699.55105: done checking for any_errors_fatal 30583 1726853699.55106: checking for max_fail_percentage 30583 1726853699.55109: done checking for max_fail_percentage 30583 1726853699.55110: checking to see if all hosts have failed and the 
running result is not ok 30583 1726853699.55110: done checking to see if all hosts have failed 30583 1726853699.55111: getting the remaining hosts for this loop 30583 1726853699.55114: done getting the remaining hosts for this loop 30583 1726853699.55118: getting the next task for host managed_node2 30583 1726853699.55126: done getting next task for host managed_node2 30583 1726853699.55130: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 1726853699.55138: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853699.55152: getting variables 30583 1726853699.55154: in VariableManager get_vars() 30583 1726853699.55313: Calling all_inventory to load vars for managed_node2 30583 1726853699.55316: Calling groups_inventory to load vars for managed_node2 30583 1726853699.55319: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853699.55328: Calling all_plugins_play to load vars for managed_node2 30583 1726853699.55331: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853699.55334: Calling groups_plugins_play to load vars for managed_node2 30583 1726853699.59015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853699.62800: done with get_vars() 30583 1726853699.62826: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:34:59 -0400 (0:00:02.015) 0:00:34.967 ****** 30583 1726853699.63060: entering _queue_task() for managed_node2/package_facts 30583 1726853699.63933: worker is 1 (out of 1 available) 30583 1726853699.63947: exiting _queue_task() for managed_node2/package_facts 30583 1726853699.63960: done queuing things up, now waiting for results queue to drain 30583 1726853699.63962: waiting for pending results... 
30583 1726853699.64658: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 1726853699.64767: in run() - task 02083763-bbaf-05ea-abc5-000000000b96 30583 1726853699.64781: variable 'ansible_search_path' from source: unknown 30583 1726853699.64785: variable 'ansible_search_path' from source: unknown 30583 1726853699.65079: calling self._execute() 30583 1726853699.65083: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853699.65086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853699.65089: variable 'omit' from source: magic vars 30583 1726853699.65344: variable 'ansible_distribution_major_version' from source: facts 30583 1726853699.65354: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853699.65360: variable 'omit' from source: magic vars 30583 1726853699.65448: variable 'omit' from source: magic vars 30583 1726853699.65490: variable 'omit' from source: magic vars 30583 1726853699.65536: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853699.65567: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853699.65687: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853699.65876: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853699.65879: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853699.65882: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853699.65884: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853699.65887: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853699.66133: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853699.66141: Set connection var ansible_timeout to 10 30583 1726853699.66153: Set connection var ansible_connection to ssh 30583 1726853699.66163: Set connection var ansible_shell_executable to /bin/sh 30583 1726853699.66166: Set connection var ansible_shell_type to sh 30583 1726853699.66172: Set connection var ansible_pipelining to False 30583 1726853699.66200: variable 'ansible_shell_executable' from source: unknown 30583 1726853699.66204: variable 'ansible_connection' from source: unknown 30583 1726853699.66207: variable 'ansible_module_compression' from source: unknown 30583 1726853699.66209: variable 'ansible_shell_type' from source: unknown 30583 1726853699.66212: variable 'ansible_shell_executable' from source: unknown 30583 1726853699.66214: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853699.66216: variable 'ansible_pipelining' from source: unknown 30583 1726853699.66218: variable 'ansible_timeout' from source: unknown 30583 1726853699.66223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853699.66705: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853699.66787: variable 'omit' from source: magic vars 30583 1726853699.66799: starting attempt loop 30583 1726853699.66802: running the handler 30583 1726853699.66816: _low_level_execute_command(): starting 30583 1726853699.66831: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853699.67737: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853699.67788: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853699.67822: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853699.67846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853699.67861: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853699.67967: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853699.69935: stdout chunk (state=3): >>>/root <<< 30583 1726853699.69939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853699.69942: stdout chunk (state=3): >>><<< 30583 1726853699.69952: stderr chunk (state=3): >>><<< 30583 1726853699.69978: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853699.70025: _low_level_execute_command(): starting 30583 1726853699.70029: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853699.6997497-32198-234575682668233 `" && echo ansible-tmp-1726853699.6997497-32198-234575682668233="` echo /root/.ansible/tmp/ansible-tmp-1726853699.6997497-32198-234575682668233 `" ) && sleep 0' 30583 1726853699.70916: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853699.70925: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853699.70935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853699.71005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853699.71009: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853699.71019: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853699.71030: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853699.71034: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853699.71037: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853699.71039: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853699.71041: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853699.71043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853699.71045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853699.71047: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853699.71048: stderr chunk (state=3): >>>debug2: match found <<< 30583 1726853699.71050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853699.71111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853699.71201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853699.71211: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853699.71311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853699.73284: stdout chunk (state=3): >>>ansible-tmp-1726853699.6997497-32198-234575682668233=/root/.ansible/tmp/ansible-tmp-1726853699.6997497-32198-234575682668233 <<< 30583 1726853699.73449: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853699.73453: stdout chunk (state=3): >>><<< 30583 1726853699.73458: stderr chunk (state=3): >>><<< 30583 1726853699.73677: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853699.6997497-32198-234575682668233=/root/.ansible/tmp/ansible-tmp-1726853699.6997497-32198-234575682668233 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853699.73681: variable 'ansible_module_compression' from source: unknown 30583 1726853699.73684: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30583 1726853699.73686: variable 'ansible_facts' from source: unknown 30583 1726853699.73869: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853699.6997497-32198-234575682668233/AnsiballZ_package_facts.py 30583 1726853699.74097: Sending initial data 30583 1726853699.74147: Sent initial data (162 bytes) 30583 1726853699.75335: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853699.75364: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853699.75387: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853699.75588: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853699.75750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853699.77410: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853699.77504: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853699.77599: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpfxkhyiws /root/.ansible/tmp/ansible-tmp-1726853699.6997497-32198-234575682668233/AnsiballZ_package_facts.py <<< 30583 1726853699.77625: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853699.6997497-32198-234575682668233/AnsiballZ_package_facts.py" <<< 30583 1726853699.77697: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpfxkhyiws" to remote "/root/.ansible/tmp/ansible-tmp-1726853699.6997497-32198-234575682668233/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853699.6997497-32198-234575682668233/AnsiballZ_package_facts.py" <<< 30583 1726853699.79513: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853699.79517: stdout chunk (state=3): >>><<< 30583 1726853699.79519: stderr chunk (state=3): >>><<< 30583 1726853699.79523: done transferring module to remote 30583 1726853699.79525: _low_level_execute_command(): starting 30583 1726853699.79527: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853699.6997497-32198-234575682668233/ /root/.ansible/tmp/ansible-tmp-1726853699.6997497-32198-234575682668233/AnsiballZ_package_facts.py && sleep 0' 30583 1726853699.80938: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853699.81008: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853699.81082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853699.83052: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853699.83056: stdout chunk (state=3): >>><<< 30583 1726853699.83059: stderr chunk (state=3): >>><<< 30583 1726853699.83164: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853699.83169: _low_level_execute_command(): starting 30583 1726853699.83185: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853699.6997497-32198-234575682668233/AnsiballZ_package_facts.py && sleep 0' 30583 1726853699.83713: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853699.83717: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853699.83719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853699.83722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853699.83724: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853699.83777: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853699.83784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853699.83825: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853699.83838: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853699.83847: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853699.83953: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853700.29243: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", 
"version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": 
"libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 30583 1726853700.29318: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": 
[{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": 
"1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": 
[{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": 
"4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 
1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64",
"source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": 
[{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": 
"2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", 
"version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": 
"sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": 
"python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": 
"9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", 
"release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 30583 1726853700.29424: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": 
"perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 30583 1726853700.29429: stdout 
chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": 
"510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 30583 1726853700.29466: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", 
"version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", 
"version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": 
"lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30583 1726853700.31548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
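The JSON above is the `package_facts` module result: under `ansible_facts.packages`, each key is a package name mapping to a list of installed instances, each with `version`, `release`, `epoch` (possibly null), `arch`, and `source`. A minimal sketch of how that structure can be consumed, using two entries copied from the log (the `full_evr` helper is hypothetical, not part of Ansible):

```python
# Sketch of the ansible_facts.packages structure returned by package_facts.
# Sample entries are taken from the log output above; each package name maps
# to a LIST because multiple instances can coexist (multi-arch, kernels).
packages = {
    "openssl": [
        {"name": "openssl", "version": "3.2.2", "release": "12.el10",
         "epoch": 1, "arch": "x86_64", "source": "rpm"},
    ],
    "kernel": [
        {"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10",
         "epoch": None, "arch": "x86_64", "source": "rpm"},
    ],
}

def full_evr(pkg: dict) -> str:
    """Render epoch:version-release the way rpm prints it; a null epoch is omitted."""
    evr = f"{pkg['version']}-{pkg['release']}"
    return f"{pkg['epoch']}:{evr}" if pkg.get("epoch") else evr

print(full_evr(packages["openssl"][0]))  # -> 1:3.2.2-12.el10
print(full_evr(packages["kernel"][0]))   # -> 6.11.0-0.rc6.23.el10
```

In a playbook the same lookup is typically a Jinja expression such as `ansible_facts.packages['openssl'][0].version` after a `package_facts:` task has run.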
<<< 30583 1726853700.31552: stdout chunk (state=3): >>><<< 30583 1726853700.31554: stderr chunk (state=3): >>><<< 30583 1726853700.32067: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853700.35123: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853699.6997497-32198-234575682668233/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853700.35146: _low_level_execute_command(): starting 30583 1726853700.35162: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853699.6997497-32198-234575682668233/ > /dev/null 2>&1 && sleep 0' 30583 1726853700.35974: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853700.35990: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853700.36023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853700.36031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853700.36037: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853700.36040: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853700.36042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853700.36132: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853700.36137: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address 
<<< 30583 1726853700.36140: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853700.36153: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853700.36163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853700.36460: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853700.36494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853700.38413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853700.38445: stderr chunk (state=3): >>><<< 30583 1726853700.38448: stdout chunk (state=3): >>><<< 30583 1726853700.38476: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853700.38478: handler run complete 30583 1726853700.39021: variable 'ansible_facts' from source: unknown 30583 1726853700.39299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853700.40847: variable 'ansible_facts' from source: unknown 30583 1726853700.41118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853700.41528: attempt loop complete, returning result 30583 1726853700.41536: _execute() done 30583 1726853700.41539: dumping result to json 30583 1726853700.41655: done dumping result, returning 30583 1726853700.41666: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-05ea-abc5-000000000b96] 30583 1726853700.41673: sending task result for task 02083763-bbaf-05ea-abc5-000000000b96 30583 1726853700.43374: done sending task result for task 02083763-bbaf-05ea-abc5-000000000b96 30583 1726853700.43378: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853700.43522: no more pending results, returning what we have 30583 1726853700.43525: results queue empty 30583 1726853700.43526: checking for any_errors_fatal 30583 1726853700.43532: done checking for any_errors_fatal 30583 1726853700.43533: checking for max_fail_percentage 30583 1726853700.43534: done checking for max_fail_percentage 30583 1726853700.43535: checking to see if all hosts have failed and the running 
result is not ok 30583 1726853700.43536: done checking to see if all hosts have failed 30583 1726853700.43536: getting the remaining hosts for this loop 30583 1726853700.43538: done getting the remaining hosts for this loop 30583 1726853700.43541: getting the next task for host managed_node2 30583 1726853700.43548: done getting next task for host managed_node2 30583 1726853700.43551: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853700.43556: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853700.43566: getting variables 30583 1726853700.43567: in VariableManager get_vars() 30583 1726853700.43595: Calling all_inventory to load vars for managed_node2 30583 1726853700.43598: Calling groups_inventory to load vars for managed_node2 30583 1726853700.43600: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853700.43607: Calling all_plugins_play to load vars for managed_node2 30583 1726853700.43610: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853700.43617: Calling groups_plugins_play to load vars for managed_node2 30583 1726853700.45986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853700.47758: done with get_vars() 30583 1726853700.47794: done getting variables 30583 1726853700.47860: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:35:00 -0400 (0:00:00.849) 0:00:35.816 ****** 30583 1726853700.48018: entering _queue_task() for managed_node2/debug 30583 1726853700.48610: worker is 1 (out of 1 available) 30583 1726853700.48621: exiting _queue_task() for managed_node2/debug 30583 1726853700.48634: done queuing things up, now waiting for results queue to drain 30583 1726853700.48635: waiting for pending results... 
30583 1726853700.49005: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853700.49225: in run() - task 02083763-bbaf-05ea-abc5-000000000b34 30583 1726853700.49245: variable 'ansible_search_path' from source: unknown 30583 1726853700.49253: variable 'ansible_search_path' from source: unknown 30583 1726853700.49297: calling self._execute() 30583 1726853700.49408: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853700.49430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853700.49534: variable 'omit' from source: magic vars 30583 1726853700.49861: variable 'ansible_distribution_major_version' from source: facts 30583 1726853700.49881: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853700.49893: variable 'omit' from source: magic vars 30583 1726853700.49967: variable 'omit' from source: magic vars 30583 1726853700.50076: variable 'network_provider' from source: set_fact 30583 1726853700.50100: variable 'omit' from source: magic vars 30583 1726853700.50148: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853700.50197: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853700.50224: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853700.50250: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853700.50267: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853700.50313: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853700.50322: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 
1726853700.50330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853700.50510: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853700.50516: Set connection var ansible_timeout to 10 30583 1726853700.50519: Set connection var ansible_connection to ssh 30583 1726853700.50521: Set connection var ansible_shell_executable to /bin/sh 30583 1726853700.50523: Set connection var ansible_shell_type to sh 30583 1726853700.50526: Set connection var ansible_pipelining to False 30583 1726853700.50527: variable 'ansible_shell_executable' from source: unknown 30583 1726853700.50530: variable 'ansible_connection' from source: unknown 30583 1726853700.50532: variable 'ansible_module_compression' from source: unknown 30583 1726853700.50533: variable 'ansible_shell_type' from source: unknown 30583 1726853700.50535: variable 'ansible_shell_executable' from source: unknown 30583 1726853700.50538: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853700.50547: variable 'ansible_pipelining' from source: unknown 30583 1726853700.50553: variable 'ansible_timeout' from source: unknown 30583 1726853700.50560: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853700.50714: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853700.50740: variable 'omit' from source: magic vars 30583 1726853700.50749: starting attempt loop 30583 1726853700.50835: running the handler 30583 1726853700.50839: handler run complete 30583 1726853700.50842: attempt loop complete, returning result 30583 1726853700.50845: _execute() done 30583 1726853700.50847: dumping result to json 30583 1726853700.50850: done dumping result, returning 
30583 1726853700.50856: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-05ea-abc5-000000000b34] 30583 1726853700.50866: sending task result for task 02083763-bbaf-05ea-abc5-000000000b34 ok: [managed_node2] => {} MSG: Using network provider: nm 30583 1726853700.51058: no more pending results, returning what we have 30583 1726853700.51061: results queue empty 30583 1726853700.51062: checking for any_errors_fatal 30583 1726853700.51277: done checking for any_errors_fatal 30583 1726853700.51278: checking for max_fail_percentage 30583 1726853700.51280: done checking for max_fail_percentage 30583 1726853700.51281: checking to see if all hosts have failed and the running result is not ok 30583 1726853700.51282: done checking to see if all hosts have failed 30583 1726853700.51283: getting the remaining hosts for this loop 30583 1726853700.51285: done getting the remaining hosts for this loop 30583 1726853700.51289: getting the next task for host managed_node2 30583 1726853700.51297: done getting next task for host managed_node2 30583 1726853700.51302: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853700.51306: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853700.51321: getting variables 30583 1726853700.51323: in VariableManager get_vars() 30583 1726853700.51365: Calling all_inventory to load vars for managed_node2 30583 1726853700.51368: Calling groups_inventory to load vars for managed_node2 30583 1726853700.51775: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853700.51786: Calling all_plugins_play to load vars for managed_node2 30583 1726853700.51789: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853700.51793: Calling groups_plugins_play to load vars for managed_node2 30583 1726853700.52496: done sending task result for task 02083763-bbaf-05ea-abc5-000000000b34 30583 1726853700.52500: WORKER PROCESS EXITING 30583 1726853700.54732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853700.58124: done with get_vars() 30583 1726853700.58162: done getting variables 30583 1726853700.58218: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:35:00 -0400 (0:00:00.102) 0:00:35.919 ****** 30583 1726853700.58259: entering _queue_task() for managed_node2/fail 30583 1726853700.59112: worker is 1 (out of 1 available) 30583 1726853700.59124: exiting _queue_task() for managed_node2/fail 30583 1726853700.59137: done queuing things up, now waiting for results queue to drain 30583 1726853700.59138: waiting for pending results... 30583 1726853700.59988: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853700.60633: in run() - task 02083763-bbaf-05ea-abc5-000000000b35 30583 1726853700.60637: variable 'ansible_search_path' from source: unknown 30583 1726853700.60641: variable 'ansible_search_path' from source: unknown 30583 1726853700.61067: calling self._execute() 30583 1726853700.61282: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853700.61286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853700.61293: variable 'omit' from source: magic vars 30583 1726853700.62479: variable 'ansible_distribution_major_version' from source: facts 30583 1726853700.62503: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853700.62961: variable 'network_state' from source: role '' defaults 30583 1726853700.63092: Evaluated conditional (network_state != {}): False 30583 1726853700.63103: when evaluation is False, skipping this task 30583 1726853700.63110: _execute() done 30583 1726853700.63125: dumping result to json 30583 1726853700.63132: done dumping result, returning 30583 1726853700.63144: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network 
state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-05ea-abc5-000000000b35] 30583 1726853700.63152: sending task result for task 02083763-bbaf-05ea-abc5-000000000b35 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853700.63318: no more pending results, returning what we have 30583 1726853700.63322: results queue empty 30583 1726853700.63323: checking for any_errors_fatal 30583 1726853700.63331: done checking for any_errors_fatal 30583 1726853700.63332: checking for max_fail_percentage 30583 1726853700.63335: done checking for max_fail_percentage 30583 1726853700.63335: checking to see if all hosts have failed and the running result is not ok 30583 1726853700.63336: done checking to see if all hosts have failed 30583 1726853700.63337: getting the remaining hosts for this loop 30583 1726853700.63339: done getting the remaining hosts for this loop 30583 1726853700.63343: getting the next task for host managed_node2 30583 1726853700.63352: done getting next task for host managed_node2 30583 1726853700.63357: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30583 1726853700.63363: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853700.63392: getting variables 30583 1726853700.63395: in VariableManager get_vars() 30583 1726853700.63440: Calling all_inventory to load vars for managed_node2 30583 1726853700.63444: Calling groups_inventory to load vars for managed_node2 30583 1726853700.63447: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853700.63460: Calling all_plugins_play to load vars for managed_node2 30583 1726853700.63463: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853700.63466: Calling groups_plugins_play to load vars for managed_node2 30583 1726853700.64581: done sending task result for task 02083763-bbaf-05ea-abc5-000000000b35 30583 1726853700.64586: WORKER PROCESS EXITING 30583 1726853700.67010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853700.75882: done with get_vars() 30583 1726853700.75911: done getting variables 30583 1726853700.76149: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:35:00 -0400 (0:00:00.179) 0:00:36.099 ****** 30583 1726853700.76188: entering _queue_task() for managed_node2/fail 30583 1726853700.76965: worker is 1 (out of 1 available) 30583 1726853700.76983: exiting _queue_task() for managed_node2/fail 30583 1726853700.76997: done queuing things up, now waiting for results queue to drain 30583 1726853700.76998: waiting for pending results... 30583 1726853700.77444: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30583 1726853700.77647: in run() - task 02083763-bbaf-05ea-abc5-000000000b36 30583 1726853700.77670: variable 'ansible_search_path' from source: unknown 30583 1726853700.77681: variable 'ansible_search_path' from source: unknown 30583 1726853700.77732: calling self._execute() 30583 1726853700.77841: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853700.77858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853700.77876: variable 'omit' from source: magic vars 30583 1726853700.78277: variable 'ansible_distribution_major_version' from source: facts 30583 1726853700.78296: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853700.78426: variable 'network_state' from source: role '' defaults 30583 1726853700.78443: Evaluated conditional (network_state != {}): False 30583 1726853700.78450: when evaluation is False, skipping this task 30583 1726853700.78460: _execute() done 30583 1726853700.78501: dumping result to json 30583 1726853700.78504: done dumping result, returning 30583 1726853700.78507: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the 
system version of the managed host is below 8 [02083763-bbaf-05ea-abc5-000000000b36] 30583 1726853700.78509: sending task result for task 02083763-bbaf-05ea-abc5-000000000b36 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853700.78662: no more pending results, returning what we have 30583 1726853700.78667: results queue empty 30583 1726853700.78668: checking for any_errors_fatal 30583 1726853700.78679: done checking for any_errors_fatal 30583 1726853700.78680: checking for max_fail_percentage 30583 1726853700.78682: done checking for max_fail_percentage 30583 1726853700.78683: checking to see if all hosts have failed and the running result is not ok 30583 1726853700.78684: done checking to see if all hosts have failed 30583 1726853700.78685: getting the remaining hosts for this loop 30583 1726853700.78687: done getting the remaining hosts for this loop 30583 1726853700.78691: getting the next task for host managed_node2 30583 1726853700.78702: done getting next task for host managed_node2 30583 1726853700.78707: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30583 1726853700.78713: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853700.78742: getting variables 30583 1726853700.78745: in VariableManager get_vars() 30583 1726853700.79088: Calling all_inventory to load vars for managed_node2 30583 1726853700.79091: Calling groups_inventory to load vars for managed_node2 30583 1726853700.79093: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853700.79103: Calling all_plugins_play to load vars for managed_node2 30583 1726853700.79105: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853700.79109: Calling groups_plugins_play to load vars for managed_node2 30583 1726853700.79811: done sending task result for task 02083763-bbaf-05ea-abc5-000000000b36 30583 1726853700.79815: WORKER PROCESS EXITING 30583 1726853700.80912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853700.83334: done with get_vars() 30583 1726853700.83365: done getting variables 30583 1726853700.83425: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 
or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:35:00 -0400 (0:00:00.072) 0:00:36.171 ****** 30583 1726853700.83467: entering _queue_task() for managed_node2/fail 30583 1726853700.84009: worker is 1 (out of 1 available) 30583 1726853700.84022: exiting _queue_task() for managed_node2/fail 30583 1726853700.84035: done queuing things up, now waiting for results queue to drain 30583 1726853700.84036: waiting for pending results... 30583 1726853700.84422: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30583 1726853700.84628: in run() - task 02083763-bbaf-05ea-abc5-000000000b37 30583 1726853700.84651: variable 'ansible_search_path' from source: unknown 30583 1726853700.84659: variable 'ansible_search_path' from source: unknown 30583 1726853700.84699: calling self._execute() 30583 1726853700.84811: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853700.84848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853700.84864: variable 'omit' from source: magic vars 30583 1726853700.85306: variable 'ansible_distribution_major_version' from source: facts 30583 1726853700.85322: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853700.85497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853700.88073: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853700.88243: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853700.88329: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 
1726853700.88486: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853700.88518: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853700.88728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853700.88765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853700.88828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853700.88875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853700.88956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853700.89151: variable 'ansible_distribution_major_version' from source: facts 30583 1726853700.89176: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30583 1726853700.89411: variable 'ansible_distribution' from source: facts 30583 1726853700.89441: variable '__network_rh_distros' from source: role '' defaults 30583 1726853700.89463: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30583 1726853700.89908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853700.89944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853700.89976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853700.90032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853700.90103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853700.90143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853700.90175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853700.90213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853700.90283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853700.90346: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853700.90413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853700.90451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853700.90548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853700.90552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853700.90565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853700.90956: variable 'network_connections' from source: include params 30583 1726853700.90980: variable 'interface' from source: play vars 30583 1726853700.91057: variable 'interface' from source: play vars 30583 1726853700.91084: variable 'network_state' from source: role '' defaults 30583 1726853700.91181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853700.91861: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853700.91866: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 
1726853700.91907: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853700.91943: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853700.92039: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853700.92092: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853700.92131: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853700.92167: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853700.92392: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30583 1726853700.92396: when evaluation is False, skipping this task 30583 1726853700.92398: _execute() done 30583 1726853700.92400: dumping result to json 30583 1726853700.92402: done dumping result, returning 30583 1726853700.92404: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-05ea-abc5-000000000b37] 30583 1726853700.92407: sending task result for task 02083763-bbaf-05ea-abc5-000000000b37 30583 1726853700.92483: done sending task 
result for task 02083763-bbaf-05ea-abc5-000000000b37 30583 1726853700.92487: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30583 1726853700.92548: no more pending results, returning what we have 30583 1726853700.92553: results queue empty 30583 1726853700.92554: checking for any_errors_fatal 30583 1726853700.92559: done checking for any_errors_fatal 30583 1726853700.92560: checking for max_fail_percentage 30583 1726853700.92562: done checking for max_fail_percentage 30583 1726853700.92563: checking to see if all hosts have failed and the running result is not ok 30583 1726853700.92564: done checking to see if all hosts have failed 30583 1726853700.92565: getting the remaining hosts for this loop 30583 1726853700.92567: done getting the remaining hosts for this loop 30583 1726853700.92572: getting the next task for host managed_node2 30583 1726853700.92581: done getting next task for host managed_node2 30583 1726853700.92585: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30583 1726853700.92592: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853700.92622: getting variables 30583 1726853700.92625: in VariableManager get_vars() 30583 1726853700.92721: Calling all_inventory to load vars for managed_node2 30583 1726853700.92725: Calling groups_inventory to load vars for managed_node2 30583 1726853700.92728: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853700.92740: Calling all_plugins_play to load vars for managed_node2 30583 1726853700.92744: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853700.92747: Calling groups_plugins_play to load vars for managed_node2 30583 1726853700.95245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853700.97701: done with get_vars() 30583 1726853700.97734: done getting variables 30583 1726853700.97826: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:35:00 -0400 (0:00:00.145) 0:00:36.317 ****** 30583 1726853700.98005: entering _queue_task() for managed_node2/dnf 30583 1726853700.98984: worker is 1 (out of 1 available) 30583 1726853700.98998: exiting _queue_task() for managed_node2/dnf 30583 1726853700.99012: done queuing things up, now waiting for results queue to drain 30583 1726853700.99015: waiting for pending results... 30583 1726853700.99655: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30583 1726853701.00151: in run() - task 02083763-bbaf-05ea-abc5-000000000b38 30583 1726853701.00155: variable 'ansible_search_path' from source: unknown 30583 1726853701.00159: variable 'ansible_search_path' from source: unknown 30583 1726853701.00162: calling self._execute() 30583 1726853701.00483: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853701.00486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853701.00488: variable 'omit' from source: magic vars 30583 1726853701.01726: variable 'ansible_distribution_major_version' from source: facts 30583 1726853701.01745: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853701.02381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853701.07468: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853701.07678: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853701.07740: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853701.07858: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853701.07891: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853701.08078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853701.08185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853701.08578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853701.08582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853701.08585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853701.08773: variable 'ansible_distribution' from source: facts 30583 1726853701.09120: variable 'ansible_distribution_major_version' from source: facts 30583 1726853701.09123: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30583 1726853701.09276: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853701.09738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853701.09949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853701.10105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853701.10197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853701.10263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853701.10578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853701.10582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853701.10584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853701.10877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853701.10881: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853701.10884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853701.11018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853701.11049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853701.11202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853701.11435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853701.11817: variable 'network_connections' from source: include params 30583 1726853701.11891: variable 'interface' from source: play vars 30583 1726853701.11968: variable 'interface' from source: play vars 30583 1726853701.12307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853701.12977: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853701.13123: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853701.13158: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853701.13308: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853701.13531: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853701.13564: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853701.13676: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853701.13708: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853701.13909: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853701.14709: variable 'network_connections' from source: include params 30583 1726853701.14869: variable 'interface' from source: play vars 30583 1726853701.15040: variable 'interface' from source: play vars 30583 1726853701.15295: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853701.15298: when evaluation is False, skipping this task 30583 1726853701.15301: _execute() done 30583 1726853701.15303: dumping result to json 30583 1726853701.15305: done dumping result, returning 30583 1726853701.15307: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000000b38] 30583 
1726853701.15309: sending task result for task 02083763-bbaf-05ea-abc5-000000000b38 30583 1726853701.15381: done sending task result for task 02083763-bbaf-05ea-abc5-000000000b38 30583 1726853701.15384: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853701.15447: no more pending results, returning what we have 30583 1726853701.15450: results queue empty 30583 1726853701.15451: checking for any_errors_fatal 30583 1726853701.15458: done checking for any_errors_fatal 30583 1726853701.15459: checking for max_fail_percentage 30583 1726853701.15461: done checking for max_fail_percentage 30583 1726853701.15462: checking to see if all hosts have failed and the running result is not ok 30583 1726853701.15462: done checking to see if all hosts have failed 30583 1726853701.15463: getting the remaining hosts for this loop 30583 1726853701.15465: done getting the remaining hosts for this loop 30583 1726853701.15469: getting the next task for host managed_node2 30583 1726853701.15478: done getting next task for host managed_node2 30583 1726853701.15483: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30583 1726853701.15489: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853701.15508: getting variables 30583 1726853701.15510: in VariableManager get_vars() 30583 1726853701.15548: Calling all_inventory to load vars for managed_node2 30583 1726853701.15551: Calling groups_inventory to load vars for managed_node2 30583 1726853701.15554: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853701.15565: Calling all_plugins_play to load vars for managed_node2 30583 1726853701.15568: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853701.15876: Calling groups_plugins_play to load vars for managed_node2 30583 1726853701.19134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853701.23403: done with get_vars() 30583 1726853701.23439: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30583 1726853701.23517: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:35:01 -0400 (0:00:00.255) 0:00:36.572 ****** 30583 1726853701.23551: entering _queue_task() for managed_node2/yum 30583 1726853701.24516: worker is 1 (out of 1 available) 30583 1726853701.24529: exiting _queue_task() for managed_node2/yum 30583 1726853701.24542: done queuing things up, now waiting for results queue to drain 30583 1726853701.24544: waiting for pending results... 30583 1726853701.24938: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30583 1726853701.25308: in run() - task 02083763-bbaf-05ea-abc5-000000000b39 30583 1726853701.25379: variable 'ansible_search_path' from source: unknown 30583 1726853701.25382: variable 'ansible_search_path' from source: unknown 30583 1726853701.25411: calling self._execute() 30583 1726853701.25629: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853701.25636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853701.25647: variable 'omit' from source: magic vars 30583 1726853701.26969: variable 'ansible_distribution_major_version' from source: facts 30583 1726853701.27242: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853701.27638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853701.33778: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853701.33782: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853701.33800: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853701.33963: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853701.33989: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853701.34288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853701.34316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853701.34342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853701.34503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853701.34518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853701.34896: variable 'ansible_distribution_major_version' from source: facts 30583 1726853701.34900: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30583 1726853701.34902: when evaluation is False, skipping this task 30583 1726853701.34905: _execute() done 30583 1726853701.34907: dumping result to json 30583 1726853701.34909: done dumping result, returning 30583 1726853701.34911: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000000b39] 30583 1726853701.34914: sending task result for task 02083763-bbaf-05ea-abc5-000000000b39 30583 1726853701.34993: done sending task result for task 02083763-bbaf-05ea-abc5-000000000b39 30583 1726853701.34996: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30583 1726853701.35048: no more pending results, returning what we have 30583 1726853701.35051: results queue empty 30583 1726853701.35052: checking for any_errors_fatal 30583 1726853701.35058: done checking for any_errors_fatal 30583 1726853701.35058: checking for max_fail_percentage 30583 1726853701.35060: done checking for max_fail_percentage 30583 1726853701.35061: checking to see if all hosts have failed and the running result is not ok 30583 1726853701.35062: done checking to see if all hosts have failed 30583 1726853701.35062: getting the remaining hosts for this loop 30583 1726853701.35064: done getting the remaining hosts for this loop 30583 1726853701.35067: getting the next task for host managed_node2 30583 1726853701.35077: done getting next task for host managed_node2 30583 1726853701.35081: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30583 1726853701.35086: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853701.35107: getting variables 30583 1726853701.35109: in VariableManager get_vars() 30583 1726853701.35148: Calling all_inventory to load vars for managed_node2 30583 1726853701.35151: Calling groups_inventory to load vars for managed_node2 30583 1726853701.35154: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853701.35165: Calling all_plugins_play to load vars for managed_node2 30583 1726853701.35169: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853701.35542: Calling groups_plugins_play to load vars for managed_node2 30583 1726853701.38492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853701.41616: done with get_vars() 30583 1726853701.41646: done getting variables 30583 1726853701.41708: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:35:01 -0400 (0:00:00.181) 0:00:36.754 ****** 30583 1726853701.41745: entering _queue_task() for managed_node2/fail 30583 1726853701.42531: worker is 1 (out of 1 available) 30583 1726853701.42547: exiting _queue_task() for managed_node2/fail 30583 1726853701.42559: done queuing things up, now waiting for results queue to drain 30583 1726853701.42560: waiting for pending results... 30583 1726853701.43215: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30583 1726853701.43503: in run() - task 02083763-bbaf-05ea-abc5-000000000b3a 30583 1726853701.43525: variable 'ansible_search_path' from source: unknown 30583 1726853701.43528: variable 'ansible_search_path' from source: unknown 30583 1726853701.43635: calling self._execute() 30583 1726853701.43766: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853701.43876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853701.43895: variable 'omit' from source: magic vars 30583 1726853701.45211: variable 'ansible_distribution_major_version' from source: facts 30583 1726853701.45224: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853701.45467: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853701.45892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853701.48979: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853701.49279: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853701.49437: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853701.49440: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853701.49468: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853701.49580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853701.49608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853701.49749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853701.49850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853701.49945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853701.49949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853701.50052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853701.50084: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853701.50125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853701.50139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853701.50294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853701.50316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853701.50339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853701.50489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853701.50504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853701.50915: variable 'network_connections' from source: include params 30583 1726853701.50927: variable 'interface' from source: play vars 30583 1726853701.51140: variable 'interface' from source: play vars 30583 1726853701.51144: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853701.51298: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853701.51380: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853701.51410: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853701.51443: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853701.51493: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853701.51523: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853701.51684: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853701.51687: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853701.51689: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853701.51947: variable 'network_connections' from source: include params 30583 1726853701.51950: variable 'interface' from source: play vars 30583 1726853701.52022: variable 'interface' from source: play vars 30583 1726853701.52055: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853701.52061: when evaluation is False, skipping this task 30583 
1726853701.52064: _execute() done 30583 1726853701.52067: dumping result to json 30583 1726853701.52072: done dumping result, returning 30583 1726853701.52082: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000000b3a] 30583 1726853701.52087: sending task result for task 02083763-bbaf-05ea-abc5-000000000b3a 30583 1726853701.52195: done sending task result for task 02083763-bbaf-05ea-abc5-000000000b3a 30583 1726853701.52199: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853701.52267: no more pending results, returning what we have 30583 1726853701.52272: results queue empty 30583 1726853701.52273: checking for any_errors_fatal 30583 1726853701.52281: done checking for any_errors_fatal 30583 1726853701.52281: checking for max_fail_percentage 30583 1726853701.52283: done checking for max_fail_percentage 30583 1726853701.52284: checking to see if all hosts have failed and the running result is not ok 30583 1726853701.52285: done checking to see if all hosts have failed 30583 1726853701.52285: getting the remaining hosts for this loop 30583 1726853701.52287: done getting the remaining hosts for this loop 30583 1726853701.52291: getting the next task for host managed_node2 30583 1726853701.52299: done getting next task for host managed_node2 30583 1726853701.52302: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30583 1726853701.52307: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853701.52327: getting variables 30583 1726853701.52328: in VariableManager get_vars() 30583 1726853701.52364: Calling all_inventory to load vars for managed_node2 30583 1726853701.52367: Calling groups_inventory to load vars for managed_node2 30583 1726853701.52369: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853701.52381: Calling all_plugins_play to load vars for managed_node2 30583 1726853701.52384: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853701.52387: Calling groups_plugins_play to load vars for managed_node2 30583 1726853701.55085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853701.58157: done with get_vars() 30583 1726853701.58303: done getting variables 30583 1726853701.58364: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:35:01 -0400 (0:00:00.166) 0:00:36.921 ****** 30583 1726853701.58404: entering _queue_task() for managed_node2/package 30583 1726853701.59306: worker is 1 (out of 1 available) 30583 1726853701.59319: exiting _queue_task() for managed_node2/package 30583 1726853701.59332: done queuing things up, now waiting for results queue to drain 30583 1726853701.59334: waiting for pending results... 30583 1726853701.59926: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 30583 1726853701.60154: in run() - task 02083763-bbaf-05ea-abc5-000000000b3b 30583 1726853701.60169: variable 'ansible_search_path' from source: unknown 30583 1726853701.60175: variable 'ansible_search_path' from source: unknown 30583 1726853701.60277: calling self._execute() 30583 1726853701.60581: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853701.60587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853701.60598: variable 'omit' from source: magic vars 30583 1726853701.61049: variable 'ansible_distribution_major_version' from source: facts 30583 1726853701.61065: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853701.61274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853701.61557: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853701.61609: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853701.61654: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853701.62029: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853701.62155: variable 'network_packages' from source: role '' defaults 30583 1726853701.62276: variable '__network_provider_setup' from source: role '' defaults 30583 1726853701.62404: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853701.62406: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853701.62408: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853701.62427: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853701.62594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853701.64845: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853701.64923: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853701.64963: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853701.65048: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853701.65083: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853701.65189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853701.65240: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853701.65273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853701.65318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853701.65341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853701.65394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853701.65437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853701.65457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853701.65545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853701.65549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 
1726853701.65750: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853701.65875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853701.65904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853701.65932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853701.65977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853701.65996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853701.66095: variable 'ansible_python' from source: facts 30583 1726853701.66195: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853701.66206: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853701.66289: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853701.66428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853701.66457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853701.66488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853701.66535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853701.66554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853701.66605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853701.66648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853701.66679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853701.66721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853701.66743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853701.66955: variable 'network_connections' from source: include params 
30583 1726853701.66958: variable 'interface' from source: play vars 30583 1726853701.67009: variable 'interface' from source: play vars 30583 1726853701.67091: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853701.67122: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853701.67154: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853701.67195: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853701.67246: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853701.67540: variable 'network_connections' from source: include params 30583 1726853701.67550: variable 'interface' from source: play vars 30583 1726853701.67656: variable 'interface' from source: play vars 30583 1726853701.67719: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853701.67824: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853701.68330: variable 'network_connections' from source: include params 30583 1726853701.68477: variable 'interface' from source: play vars 30583 1726853701.68547: variable 'interface' from source: play vars 30583 1726853701.68580: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853701.68859: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853701.69425: variable 'network_connections' 
from source: include params 30583 1726853701.69437: variable 'interface' from source: play vars 30583 1726853701.69503: variable 'interface' from source: play vars 30583 1726853701.69803: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853701.69912: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853701.69915: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853701.69942: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853701.70476: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853701.71399: variable 'network_connections' from source: include params 30583 1726853701.71409: variable 'interface' from source: play vars 30583 1726853701.71523: variable 'interface' from source: play vars 30583 1726853701.71976: variable 'ansible_distribution' from source: facts 30583 1726853701.71979: variable '__network_rh_distros' from source: role '' defaults 30583 1726853701.71981: variable 'ansible_distribution_major_version' from source: facts 30583 1726853701.71983: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853701.71996: variable 'ansible_distribution' from source: facts 30583 1726853701.72008: variable '__network_rh_distros' from source: role '' defaults 30583 1726853701.72017: variable 'ansible_distribution_major_version' from source: facts 30583 1726853701.72029: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853701.72357: variable 'ansible_distribution' from source: facts 30583 1726853701.72366: variable '__network_rh_distros' from source: role '' defaults 30583 1726853701.72378: variable 'ansible_distribution_major_version' from source: facts 30583 1726853701.72418: variable 'network_provider' from source: set_fact 30583 
1726853701.72567: variable 'ansible_facts' from source: unknown 30583 1726853701.74087: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30583 1726853701.74091: when evaluation is False, skipping this task 30583 1726853701.74093: _execute() done 30583 1726853701.74095: dumping result to json 30583 1726853701.74098: done dumping result, returning 30583 1726853701.74101: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-05ea-abc5-000000000b3b] 30583 1726853701.74103: sending task result for task 02083763-bbaf-05ea-abc5-000000000b3b 30583 1726853701.74179: done sending task result for task 02083763-bbaf-05ea-abc5-000000000b3b 30583 1726853701.74182: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30583 1726853701.74239: no more pending results, returning what we have 30583 1726853701.74243: results queue empty 30583 1726853701.74244: checking for any_errors_fatal 30583 1726853701.74250: done checking for any_errors_fatal 30583 1726853701.74250: checking for max_fail_percentage 30583 1726853701.74253: done checking for max_fail_percentage 30583 1726853701.74254: checking to see if all hosts have failed and the running result is not ok 30583 1726853701.74255: done checking to see if all hosts have failed 30583 1726853701.74255: getting the remaining hosts for this loop 30583 1726853701.74257: done getting the remaining hosts for this loop 30583 1726853701.74261: getting the next task for host managed_node2 30583 1726853701.74269: done getting next task for host managed_node2 30583 1726853701.74275: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30583 1726853701.74280: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853701.74302: getting variables 30583 1726853701.74304: in VariableManager get_vars() 30583 1726853701.74347: Calling all_inventory to load vars for managed_node2 30583 1726853701.74350: Calling groups_inventory to load vars for managed_node2 30583 1726853701.74352: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853701.74364: Calling all_plugins_play to load vars for managed_node2 30583 1726853701.74367: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853701.74373: Calling groups_plugins_play to load vars for managed_node2 30583 1726853701.76057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853701.77615: done with get_vars() 30583 1726853701.77647: done getting variables 30583 1726853701.77708: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:35:01 -0400 (0:00:00.193) 0:00:37.114 ****** 30583 1726853701.77745: entering _queue_task() for managed_node2/package 30583 1726853701.78112: worker is 1 (out of 1 available) 30583 1726853701.78123: exiting _queue_task() for managed_node2/package 30583 1726853701.78135: done queuing things up, now waiting for results queue to drain 30583 1726853701.78136: waiting for pending results... 
30583 1726853701.78434: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30583 1726853701.78591: in run() - task 02083763-bbaf-05ea-abc5-000000000b3c 30583 1726853701.78611: variable 'ansible_search_path' from source: unknown 30583 1726853701.78619: variable 'ansible_search_path' from source: unknown 30583 1726853701.78663: calling self._execute() 30583 1726853701.78766: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853701.78781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853701.78797: variable 'omit' from source: magic vars 30583 1726853701.79191: variable 'ansible_distribution_major_version' from source: facts 30583 1726853701.79208: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853701.79359: variable 'network_state' from source: role '' defaults 30583 1726853701.79362: Evaluated conditional (network_state != {}): False 30583 1726853701.79365: when evaluation is False, skipping this task 30583 1726853701.79367: _execute() done 30583 1726853701.79369: dumping result to json 30583 1726853701.79378: done dumping result, returning 30583 1726853701.79577: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-05ea-abc5-000000000b3c] 30583 1726853701.79580: sending task result for task 02083763-bbaf-05ea-abc5-000000000b3c 30583 1726853701.79651: done sending task result for task 02083763-bbaf-05ea-abc5-000000000b3c 30583 1726853701.79654: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853701.79706: no more pending results, returning what we have 30583 1726853701.79711: results queue empty 30583 1726853701.79712: checking 
for any_errors_fatal 30583 1726853701.79720: done checking for any_errors_fatal 30583 1726853701.79721: checking for max_fail_percentage 30583 1726853701.79723: done checking for max_fail_percentage 30583 1726853701.79724: checking to see if all hosts have failed and the running result is not ok 30583 1726853701.79725: done checking to see if all hosts have failed 30583 1726853701.79725: getting the remaining hosts for this loop 30583 1726853701.79727: done getting the remaining hosts for this loop 30583 1726853701.79731: getting the next task for host managed_node2 30583 1726853701.79740: done getting next task for host managed_node2 30583 1726853701.79743: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30583 1726853701.79749: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853701.79776: getting variables 30583 1726853701.79779: in VariableManager get_vars() 30583 1726853701.79819: Calling all_inventory to load vars for managed_node2 30583 1726853701.79822: Calling groups_inventory to load vars for managed_node2 30583 1726853701.79824: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853701.79837: Calling all_plugins_play to load vars for managed_node2 30583 1726853701.79840: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853701.79843: Calling groups_plugins_play to load vars for managed_node2 30583 1726853701.81299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853701.82949: done with get_vars() 30583 1726853701.82975: done getting variables 30583 1726853701.83036: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:35:01 -0400 (0:00:00.053) 0:00:37.167 ****** 30583 1726853701.83074: entering _queue_task() for managed_node2/package 30583 1726853701.83441: worker is 1 (out of 1 available) 30583 1726853701.83455: exiting _queue_task() for managed_node2/package 30583 1726853701.83468: done queuing things up, now waiting for results queue to drain 30583 1726853701.83469: waiting for pending results... 
30583 1726853701.84102: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30583 1726853701.84453: in run() - task 02083763-bbaf-05ea-abc5-000000000b3d 30583 1726853701.84457: variable 'ansible_search_path' from source: unknown 30583 1726853701.84460: variable 'ansible_search_path' from source: unknown 30583 1726853701.84508: calling self._execute() 30583 1726853701.84737: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853701.84741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853701.84752: variable 'omit' from source: magic vars 30583 1726853701.85515: variable 'ansible_distribution_major_version' from source: facts 30583 1726853701.85527: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853701.85766: variable 'network_state' from source: role '' defaults 30583 1726853701.85907: Evaluated conditional (network_state != {}): False 30583 1726853701.85911: when evaluation is False, skipping this task 30583 1726853701.85914: _execute() done 30583 1726853701.85916: dumping result to json 30583 1726853701.85919: done dumping result, returning 30583 1726853701.86077: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-05ea-abc5-000000000b3d] 30583 1726853701.86081: sending task result for task 02083763-bbaf-05ea-abc5-000000000b3d 30583 1726853701.86152: done sending task result for task 02083763-bbaf-05ea-abc5-000000000b3d 30583 1726853701.86154: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853701.86205: no more pending results, returning what we have 30583 1726853701.86210: results queue empty 30583 1726853701.86211: checking for 
any_errors_fatal 30583 1726853701.86217: done checking for any_errors_fatal 30583 1726853701.86218: checking for max_fail_percentage 30583 1726853701.86220: done checking for max_fail_percentage 30583 1726853701.86221: checking to see if all hosts have failed and the running result is not ok 30583 1726853701.86222: done checking to see if all hosts have failed 30583 1726853701.86222: getting the remaining hosts for this loop 30583 1726853701.86224: done getting the remaining hosts for this loop 30583 1726853701.86228: getting the next task for host managed_node2 30583 1726853701.86237: done getting next task for host managed_node2 30583 1726853701.86242: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30583 1726853701.86247: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853701.86272: getting variables 30583 1726853701.86274: in VariableManager get_vars() 30583 1726853701.86312: Calling all_inventory to load vars for managed_node2 30583 1726853701.86314: Calling groups_inventory to load vars for managed_node2 30583 1726853701.86316: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853701.86328: Calling all_plugins_play to load vars for managed_node2 30583 1726853701.86331: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853701.86333: Calling groups_plugins_play to load vars for managed_node2 30583 1726853701.89036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853701.91040: done with get_vars() 30583 1726853701.91077: done getting variables 30583 1726853701.91138: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:35:01 -0400 (0:00:00.081) 0:00:37.248 ****** 30583 1726853701.91178: entering _queue_task() for managed_node2/service 30583 1726853701.91552: worker is 1 (out of 1 available) 30583 1726853701.91565: exiting _queue_task() for managed_node2/service 30583 1726853701.91778: done queuing things up, now waiting for results queue to drain 30583 1726853701.91780: waiting for pending results... 
30583 1726853701.91910: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30583 1726853701.92115: in run() - task 02083763-bbaf-05ea-abc5-000000000b3e 30583 1726853701.92119: variable 'ansible_search_path' from source: unknown 30583 1726853701.92121: variable 'ansible_search_path' from source: unknown 30583 1726853701.92124: calling self._execute() 30583 1726853701.92198: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853701.92211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853701.92230: variable 'omit' from source: magic vars 30583 1726853701.92612: variable 'ansible_distribution_major_version' from source: facts 30583 1726853701.92632: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853701.92767: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853701.92969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853701.95175: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853701.95604: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853701.95696: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853701.95700: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853701.95722: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853701.95807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30583 1726853701.95844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853701.95876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853701.95924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853701.95975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853701.95994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853701.96025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853701.96052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853701.96096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853701.96116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853701.96165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853701.96237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853701.96241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853701.96262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853701.96283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853701.96466: variable 'network_connections' from source: include params 30583 1726853701.96485: variable 'interface' from source: play vars 30583 1726853701.96561: variable 'interface' from source: play vars 30583 1726853701.96643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853701.96976: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853701.96980: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853701.96982: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853701.96984: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853701.96999: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853701.97025: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853701.97055: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853701.97088: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853701.97161: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853701.97410: variable 'network_connections' from source: include params 30583 1726853701.97424: variable 'interface' from source: play vars 30583 1726853701.97491: variable 'interface' from source: play vars 30583 1726853701.97534: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853701.97543: when evaluation is False, skipping this task 30583 1726853701.97551: _execute() done 30583 1726853701.97558: dumping result to json 30583 1726853701.97566: done dumping result, returning 30583 1726853701.97582: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000000b3e] 30583 1726853701.97592: sending task result for task 02083763-bbaf-05ea-abc5-000000000b3e 30583 1726853701.97880: done sending task result for task 
02083763-bbaf-05ea-abc5-000000000b3e 30583 1726853701.97892: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853701.97940: no more pending results, returning what we have 30583 1726853701.97944: results queue empty 30583 1726853701.97945: checking for any_errors_fatal 30583 1726853701.97951: done checking for any_errors_fatal 30583 1726853701.97952: checking for max_fail_percentage 30583 1726853701.97954: done checking for max_fail_percentage 30583 1726853701.97955: checking to see if all hosts have failed and the running result is not ok 30583 1726853701.97956: done checking to see if all hosts have failed 30583 1726853701.97957: getting the remaining hosts for this loop 30583 1726853701.97959: done getting the remaining hosts for this loop 30583 1726853701.97963: getting the next task for host managed_node2 30583 1726853701.97974: done getting next task for host managed_node2 30583 1726853701.97979: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30583 1726853701.97984: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853701.98007: getting variables 30583 1726853701.98009: in VariableManager get_vars() 30583 1726853701.98049: Calling all_inventory to load vars for managed_node2 30583 1726853701.98052: Calling groups_inventory to load vars for managed_node2 30583 1726853701.98055: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853701.98065: Calling all_plugins_play to load vars for managed_node2 30583 1726853701.98069: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853701.98277: Calling groups_plugins_play to load vars for managed_node2 30583 1726853702.00594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853702.03717: done with get_vars() 30583 1726853702.03752: done getting variables 30583 1726853702.03824: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:35:02 -0400 (0:00:00.126) 0:00:37.375 ****** 30583 1726853702.03870: entering _queue_task() for managed_node2/service 30583 1726853702.04299: worker is 1 (out of 1 available) 30583 1726853702.04311: exiting _queue_task() for managed_node2/service 30583 1726853702.04326: done 
queuing things up, now waiting for results queue to drain 30583 1726853702.04328: waiting for pending results... 30583 1726853702.04633: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30583 1726853702.04786: in run() - task 02083763-bbaf-05ea-abc5-000000000b3f 30583 1726853702.04801: variable 'ansible_search_path' from source: unknown 30583 1726853702.04805: variable 'ansible_search_path' from source: unknown 30583 1726853702.04840: calling self._execute() 30583 1726853702.04947: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853702.04951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853702.05076: variable 'omit' from source: magic vars 30583 1726853702.05387: variable 'ansible_distribution_major_version' from source: facts 30583 1726853702.05398: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853702.05588: variable 'network_provider' from source: set_fact 30583 1726853702.05591: variable 'network_state' from source: role '' defaults 30583 1726853702.05602: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30583 1726853702.05608: variable 'omit' from source: magic vars 30583 1726853702.05685: variable 'omit' from source: magic vars 30583 1726853702.05715: variable 'network_service_name' from source: role '' defaults 30583 1726853702.05792: variable 'network_service_name' from source: role '' defaults 30583 1726853702.06025: variable '__network_provider_setup' from source: role '' defaults 30583 1726853702.06030: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853702.06166: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853702.06224: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853702.06594: variable '__network_packages_default_nm' from source: role '' 
defaults 30583 1726853702.06877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853702.10328: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853702.10428: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853702.10467: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853702.10511: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853702.10538: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853702.10729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853702.10733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853702.10736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853702.10739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853702.10741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853702.10789: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853702.10815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853702.10837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853702.10881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853702.10894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853702.11132: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853702.11258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853702.11288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853702.11311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853702.11345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853702.11368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853702.11457: variable 'ansible_python' from source: facts 30583 1726853702.11485: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853702.11565: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853702.11645: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853702.11773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853702.11803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853702.11825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853702.11867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853702.11882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853702.11933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853702.11962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853702.12030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853702.12033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853702.12048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853702.12198: variable 'network_connections' from source: include params 30583 1726853702.12205: variable 'interface' from source: play vars 30583 1726853702.12291: variable 'interface' from source: play vars 30583 1726853702.12413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853702.12908: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853702.13050: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853702.13095: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853702.13252: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853702.13393: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853702.13420: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853702.13568: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853702.13601: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853702.13675: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853702.14395: variable 'network_connections' from source: include params 30583 1726853702.14402: variable 'interface' from source: play vars 30583 1726853702.14512: variable 'interface' from source: play vars 30583 1726853702.14630: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853702.14651: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853702.14989: variable 'network_connections' from source: include params 30583 1726853702.14994: variable 'interface' from source: play vars 30583 1726853702.15065: variable 'interface' from source: play vars 30583 1726853702.15099: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853702.15179: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853702.15966: variable 'network_connections' from source: include params 30583 1726853702.15970: variable 'interface' from source: play vars 30583 1726853702.15975: variable 'interface' from source: play vars 30583 1726853702.15977: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30583 1726853702.16039: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853702.16045: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853702.16113: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853702.16356: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853702.16952: variable 'network_connections' from source: include params 30583 1726853702.16955: variable 'interface' from source: play vars 30583 1726853702.17028: variable 'interface' from source: play vars 30583 1726853702.17376: variable 'ansible_distribution' from source: facts 30583 1726853702.17379: variable '__network_rh_distros' from source: role '' defaults 30583 1726853702.17382: variable 'ansible_distribution_major_version' from source: facts 30583 1726853702.17384: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853702.17386: variable 'ansible_distribution' from source: facts 30583 1726853702.17388: variable '__network_rh_distros' from source: role '' defaults 30583 1726853702.17390: variable 'ansible_distribution_major_version' from source: facts 30583 1726853702.17392: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853702.17495: variable 'ansible_distribution' from source: facts 30583 1726853702.17499: variable '__network_rh_distros' from source: role '' defaults 30583 1726853702.17507: variable 'ansible_distribution_major_version' from source: facts 30583 1726853702.17745: variable 'network_provider' from source: set_fact 30583 1726853702.17748: variable 'omit' from source: magic vars 30583 1726853702.17750: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853702.17753: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853702.17755: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853702.17757: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853702.17759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853702.17813: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853702.17816: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853702.17838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853702.17960: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853702.17968: Set connection var ansible_timeout to 10 30583 1726853702.17975: Set connection var ansible_connection to ssh 30583 1726853702.17992: Set connection var ansible_shell_executable to /bin/sh 30583 1726853702.17995: Set connection var ansible_shell_type to sh 30583 1726853702.18053: Set connection var ansible_pipelining to False 30583 1726853702.18056: variable 'ansible_shell_executable' from source: unknown 30583 1726853702.18059: variable 'ansible_connection' from source: unknown 30583 1726853702.18061: variable 'ansible_module_compression' from source: unknown 30583 1726853702.18063: variable 'ansible_shell_type' from source: unknown 30583 1726853702.18065: variable 'ansible_shell_executable' from source: unknown 30583 1726853702.18070: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853702.18086: variable 'ansible_pipelining' from source: unknown 30583 1726853702.18088: variable 'ansible_timeout' from source: unknown 30583 1726853702.18132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 
1726853702.18511: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853702.18523: variable 'omit' from source: magic vars 30583 1726853702.18529: starting attempt loop 30583 1726853702.18531: running the handler 30583 1726853702.18842: variable 'ansible_facts' from source: unknown 30583 1726853702.21498: _low_level_execute_command(): starting 30583 1726853702.21506: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853702.22930: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853702.23093: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853702.23210: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853702.23289: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 
1726853702.23367: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853702.25132: stdout chunk (state=3): >>>/root <<< 30583 1726853702.25217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853702.25310: stderr chunk (state=3): >>><<< 30583 1726853702.25313: stdout chunk (state=3): >>><<< 30583 1726853702.25428: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853702.25433: _low_level_execute_command(): starting 30583 1726853702.25439: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853702.2533798-32342-58595590538715 `" && echo ansible-tmp-1726853702.2533798-32342-58595590538715="` echo 
/root/.ansible/tmp/ansible-tmp-1726853702.2533798-32342-58595590538715 `" ) && sleep 0' 30583 1726853702.26082: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853702.26106: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853702.26125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853702.26142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853702.26162: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853702.26231: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853702.26282: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853702.26301: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853702.26402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853702.28393: stdout chunk (state=3): >>>ansible-tmp-1726853702.2533798-32342-58595590538715=/root/.ansible/tmp/ansible-tmp-1726853702.2533798-32342-58595590538715 <<< 30583 1726853702.28545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 
1726853702.28665: stdout chunk (state=3): >>><<< 30583 1726853702.28669: stderr chunk (state=3): >>><<< 30583 1726853702.28674: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853702.2533798-32342-58595590538715=/root/.ansible/tmp/ansible-tmp-1726853702.2533798-32342-58595590538715 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853702.28678: variable 'ansible_module_compression' from source: unknown 30583 1726853702.28721: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30583 1726853702.28806: variable 'ansible_facts' from source: unknown 30583 1726853702.29038: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853702.2533798-32342-58595590538715/AnsiballZ_systemd.py 30583 1726853702.29284: Sending initial data 30583 1726853702.29292: 
Sent initial data (155 bytes) 30583 1726853702.29866: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853702.29888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853702.29911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853702.29929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853702.29953: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853702.30016: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853702.30032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853702.30102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853702.30185: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853702.30223: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853702.30254: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853702.30362: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853702.32034: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853702.32105: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853702.32194: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpxv1m5uf2 /root/.ansible/tmp/ansible-tmp-1726853702.2533798-32342-58595590538715/AnsiballZ_systemd.py <<< 30583 1726853702.32198: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853702.2533798-32342-58595590538715/AnsiballZ_systemd.py" <<< 30583 1726853702.32280: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpxv1m5uf2" to remote "/root/.ansible/tmp/ansible-tmp-1726853702.2533798-32342-58595590538715/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853702.2533798-32342-58595590538715/AnsiballZ_systemd.py" <<< 30583 1726853702.34520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853702.34581: stderr chunk (state=3): >>><<< 30583 1726853702.34584: stdout chunk (state=3): >>><<< 30583 1726853702.34586: done transferring module to remote 30583 1726853702.34591: _low_level_execute_command(): starting 30583 1726853702.34597: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726853702.2533798-32342-58595590538715/ /root/.ansible/tmp/ansible-tmp-1726853702.2533798-32342-58595590538715/AnsiballZ_systemd.py && sleep 0' 30583 1726853702.35014: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853702.35018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853702.35051: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853702.35057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853702.35105: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853702.35109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853702.35187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853702.37103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853702.37127: stderr chunk (state=3): >>><<< 30583 1726853702.37130: stdout chunk (state=3): >>><<< 30583 1726853702.37141: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853702.37144: _low_level_execute_command(): starting 30583 1726853702.37149: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853702.2533798-32342-58595590538715/AnsiballZ_systemd.py && sleep 0' 30583 1726853702.37590: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853702.37593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853702.37596: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853702.37598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853702.37645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853702.37657: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853702.37731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853702.67599: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", 
"ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4595712", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3308113920", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1809100000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", 
"CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredum<<< 30583 1726853702.67607: stdout chunk (state=3): >>>pReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", 
"LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", 
"InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30583 1726853702.69568: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853702.69632: stderr chunk (state=3): >>><<< 30583 1726853702.69636: stdout chunk (state=3): >>><<< 30583 1726853702.69647: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4595712", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3308113920", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1809100000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", 
"ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853702.69785: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853702.2533798-32342-58595590538715/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853702.69801: _low_level_execute_command(): starting 30583 1726853702.69805: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853702.2533798-32342-58595590538715/ > /dev/null 2>&1 && sleep 0' 30583 1726853702.70447: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853702.70451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853702.70453: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853702.70455: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853702.70457: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853702.70468: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853702.70517: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853702.70594: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853702.72526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853702.72553: stderr chunk (state=3): >>><<< 30583 1726853702.72559: stdout chunk (state=3): >>><<< 30583 1726853702.72577: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853702.72586: handler run complete 30583 1726853702.72623: attempt loop complete, returning result 30583 1726853702.72626: _execute() done 30583 1726853702.72628: dumping result to json 30583 1726853702.72641: done dumping result, returning 30583 1726853702.72649: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-05ea-abc5-000000000b3f] 30583 1726853702.72653: sending task result for task 02083763-bbaf-05ea-abc5-000000000b3f 30583 1726853702.72886: done sending task result for task 02083763-bbaf-05ea-abc5-000000000b3f 30583 1726853702.72889: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853702.72944: no more pending results, returning what we have 30583 1726853702.72947: results queue empty 30583 1726853702.72948: checking for any_errors_fatal 30583 1726853702.72955: done checking for any_errors_fatal 30583 1726853702.72955: checking for max_fail_percentage 30583 1726853702.72957: done checking for max_fail_percentage 30583 1726853702.72958: checking to see if all hosts have failed and the running result is not ok 30583 1726853702.72959: done checking to see if all hosts have failed 30583 1726853702.72960: getting the remaining hosts for this loop 30583 1726853702.72962: done getting the remaining hosts for this loop 30583 1726853702.72965: getting the next task for host managed_node2 30583 1726853702.72974: done getting next task for host managed_node2 30583 1726853702.72977: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30583 1726853702.72983: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853702.72994: getting variables 30583 1726853702.72996: in VariableManager get_vars() 30583 1726853702.73026: Calling all_inventory to load vars for managed_node2 30583 1726853702.73028: Calling groups_inventory to load vars for managed_node2 30583 1726853702.73030: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853702.73040: Calling all_plugins_play to load vars for managed_node2 30583 1726853702.73042: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853702.73045: Calling groups_plugins_play to load vars for managed_node2 30583 1726853702.73844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853702.74821: done with get_vars() 30583 1726853702.74835: done getting variables 30583 1726853702.74883: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:35:02 -0400 (0:00:00.710) 0:00:38.086 ****** 30583 1726853702.74912: entering _queue_task() for managed_node2/service 30583 1726853702.75161: worker is 1 (out of 1 available) 30583 1726853702.75176: exiting _queue_task() for managed_node2/service 30583 1726853702.75189: done queuing things up, now waiting for results queue to drain 30583 1726853702.75190: waiting for pending results... 30583 1726853702.75376: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30583 1726853702.75463: in run() - task 02083763-bbaf-05ea-abc5-000000000b40 30583 1726853702.75473: variable 'ansible_search_path' from source: unknown 30583 1726853702.75477: variable 'ansible_search_path' from source: unknown 30583 1726853702.75505: calling self._execute() 30583 1726853702.75579: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853702.75582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853702.75591: variable 'omit' from source: magic vars 30583 1726853702.75875: variable 'ansible_distribution_major_version' from source: facts 30583 1726853702.75884: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853702.75968: variable 'network_provider' from source: set_fact 30583 1726853702.75973: Evaluated conditional (network_provider == "nm"): True 30583 1726853702.76032: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853702.76099: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 30583 1726853702.76217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853702.77652: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853702.77700: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853702.77729: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853702.77757: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853702.77776: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853702.77844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853702.77865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853702.77883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853702.77908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853702.77919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853702.77957: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853702.77973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853702.77989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853702.78013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853702.78024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853702.78057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853702.78074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853702.78091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853702.78114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 
1726853702.78124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853702.78223: variable 'network_connections' from source: include params 30583 1726853702.78233: variable 'interface' from source: play vars 30583 1726853702.78285: variable 'interface' from source: play vars 30583 1726853702.78333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853702.78440: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853702.78470: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853702.78494: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853702.78515: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853702.78543: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853702.78560: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853702.78577: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853702.78600: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853702.78642: variable 
'__network_wireless_connections_defined' from source: role '' defaults 30583 1726853702.78795: variable 'network_connections' from source: include params 30583 1726853702.78801: variable 'interface' from source: play vars 30583 1726853702.78841: variable 'interface' from source: play vars 30583 1726853702.78870: Evaluated conditional (__network_wpa_supplicant_required): False 30583 1726853702.78875: when evaluation is False, skipping this task 30583 1726853702.78878: _execute() done 30583 1726853702.78881: dumping result to json 30583 1726853702.78883: done dumping result, returning 30583 1726853702.78890: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-05ea-abc5-000000000b40] 30583 1726853702.78902: sending task result for task 02083763-bbaf-05ea-abc5-000000000b40 30583 1726853702.78989: done sending task result for task 02083763-bbaf-05ea-abc5-000000000b40 30583 1726853702.78992: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30583 1726853702.79119: no more pending results, returning what we have 30583 1726853702.79122: results queue empty 30583 1726853702.79123: checking for any_errors_fatal 30583 1726853702.79145: done checking for any_errors_fatal 30583 1726853702.79146: checking for max_fail_percentage 30583 1726853702.79148: done checking for max_fail_percentage 30583 1726853702.79148: checking to see if all hosts have failed and the running result is not ok 30583 1726853702.79149: done checking to see if all hosts have failed 30583 1726853702.79150: getting the remaining hosts for this loop 30583 1726853702.79151: done getting the remaining hosts for this loop 30583 1726853702.79156: getting the next task for host managed_node2 30583 1726853702.79163: done getting next task for host managed_node2 30583 1726853702.79167: ^ task is: TASK: 
fedora.linux_system_roles.network : Enable network service 30583 1726853702.79172: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853702.79192: getting variables 30583 1726853702.79193: in VariableManager get_vars() 30583 1726853702.79223: Calling all_inventory to load vars for managed_node2 30583 1726853702.79226: Calling groups_inventory to load vars for managed_node2 30583 1726853702.79228: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853702.79235: Calling all_plugins_play to load vars for managed_node2 30583 1726853702.79238: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853702.79240: Calling groups_plugins_play to load vars for managed_node2 30583 1726853702.80439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853702.81320: done with get_vars() 30583 1726853702.81339: done getting variables 30583 1726853702.81383: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:35:02 -0400 (0:00:00.064) 0:00:38.151 ****** 30583 1726853702.81406: entering _queue_task() for managed_node2/service 30583 1726853702.81646: worker is 1 (out of 1 available) 30583 1726853702.81660: exiting _queue_task() for managed_node2/service 30583 1726853702.81678: done queuing things up, now waiting for results queue to drain 30583 1726853702.81680: waiting for pending results... 
30583 1726853702.81859: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30583 1726853702.81954: in run() - task 02083763-bbaf-05ea-abc5-000000000b41 30583 1726853702.82013: variable 'ansible_search_path' from source: unknown 30583 1726853702.82017: variable 'ansible_search_path' from source: unknown 30583 1726853702.82021: calling self._execute() 30583 1726853702.82139: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853702.82144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853702.82147: variable 'omit' from source: magic vars 30583 1726853702.82576: variable 'ansible_distribution_major_version' from source: facts 30583 1726853702.82579: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853702.82615: variable 'network_provider' from source: set_fact 30583 1726853702.82629: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853702.82639: when evaluation is False, skipping this task 30583 1726853702.82646: _execute() done 30583 1726853702.82653: dumping result to json 30583 1726853702.82661: done dumping result, returning 30583 1726853702.82675: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-05ea-abc5-000000000b41] 30583 1726853702.82686: sending task result for task 02083763-bbaf-05ea-abc5-000000000b41 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853702.82837: no more pending results, returning what we have 30583 1726853702.82841: results queue empty 30583 1726853702.82842: checking for any_errors_fatal 30583 1726853702.82850: done checking for any_errors_fatal 30583 1726853702.82850: checking for max_fail_percentage 30583 1726853702.82852: done checking for max_fail_percentage 30583 
1726853702.82853: checking to see if all hosts have failed and the running result is not ok 30583 1726853702.82854: done checking to see if all hosts have failed 30583 1726853702.82854: getting the remaining hosts for this loop 30583 1726853702.82858: done getting the remaining hosts for this loop 30583 1726853702.82862: getting the next task for host managed_node2 30583 1726853702.82870: done getting next task for host managed_node2 30583 1726853702.82876: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853702.82977: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853702.83015: getting variables 30583 1726853702.83017: in VariableManager get_vars() 30583 1726853702.83049: Calling all_inventory to load vars for managed_node2 30583 1726853702.83051: Calling groups_inventory to load vars for managed_node2 30583 1726853702.83053: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853702.83065: Calling all_plugins_play to load vars for managed_node2 30583 1726853702.83067: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853702.83111: Calling groups_plugins_play to load vars for managed_node2 30583 1726853702.83124: done sending task result for task 02083763-bbaf-05ea-abc5-000000000b41 30583 1726853702.83127: WORKER PROCESS EXITING 30583 1726853702.84170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853702.85035: done with get_vars() 30583 1726853702.85050: done getting variables 30583 1726853702.85097: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:35:02 -0400 (0:00:00.037) 0:00:38.188 ****** 30583 1726853702.85123: entering _queue_task() for managed_node2/copy 30583 1726853702.85598: worker is 1 (out of 1 available) 30583 1726853702.85608: exiting _queue_task() for managed_node2/copy 30583 1726853702.85619: done queuing things up, now waiting for results queue to drain 30583 1726853702.85620: waiting for pending results... 
30583 1726853702.85704: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853702.85864: in run() - task 02083763-bbaf-05ea-abc5-000000000b42 30583 1726853702.85885: variable 'ansible_search_path' from source: unknown 30583 1726853702.85953: variable 'ansible_search_path' from source: unknown 30583 1726853702.85960: calling self._execute() 30583 1726853702.86033: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853702.86044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853702.86063: variable 'omit' from source: magic vars 30583 1726853702.86442: variable 'ansible_distribution_major_version' from source: facts 30583 1726853702.86463: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853702.86589: variable 'network_provider' from source: set_fact 30583 1726853702.86604: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853702.86612: when evaluation is False, skipping this task 30583 1726853702.86619: _execute() done 30583 1726853702.86626: dumping result to json 30583 1726853702.86711: done dumping result, returning 30583 1726853702.86716: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-05ea-abc5-000000000b42] 30583 1726853702.86718: sending task result for task 02083763-bbaf-05ea-abc5-000000000b42 30583 1726853702.86795: done sending task result for task 02083763-bbaf-05ea-abc5-000000000b42 30583 1726853702.86798: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30583 1726853702.86865: no more pending results, returning what we have 30583 1726853702.86870: results queue empty 30583 1726853702.86872: checking for 
any_errors_fatal 30583 1726853702.86878: done checking for any_errors_fatal 30583 1726853702.86879: checking for max_fail_percentage 30583 1726853702.86881: done checking for max_fail_percentage 30583 1726853702.86882: checking to see if all hosts have failed and the running result is not ok 30583 1726853702.86883: done checking to see if all hosts have failed 30583 1726853702.86884: getting the remaining hosts for this loop 30583 1726853702.86886: done getting the remaining hosts for this loop 30583 1726853702.86890: getting the next task for host managed_node2 30583 1726853702.86899: done getting next task for host managed_node2 30583 1726853702.86903: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853702.86909: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853702.86932: getting variables 30583 1726853702.86934: in VariableManager get_vars() 30583 1726853702.87078: Calling all_inventory to load vars for managed_node2 30583 1726853702.87082: Calling groups_inventory to load vars for managed_node2 30583 1726853702.87085: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853702.87097: Calling all_plugins_play to load vars for managed_node2 30583 1726853702.87101: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853702.87104: Calling groups_plugins_play to load vars for managed_node2 30583 1726853702.88609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853702.90221: done with get_vars() 30583 1726853702.90245: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:35:02 -0400 (0:00:00.052) 0:00:38.240 ****** 30583 1726853702.90334: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853702.90898: worker is 1 (out of 1 available) 30583 1726853702.90907: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853702.90917: done queuing things up, now waiting for results queue to drain 30583 1726853702.90918: waiting for pending results... 
30583 1726853702.91049: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853702.91119: in run() - task 02083763-bbaf-05ea-abc5-000000000b43 30583 1726853702.91143: variable 'ansible_search_path' from source: unknown 30583 1726853702.91173: variable 'ansible_search_path' from source: unknown 30583 1726853702.91217: calling self._execute() 30583 1726853702.91329: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853702.91360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853702.91364: variable 'omit' from source: magic vars 30583 1726853702.91720: variable 'ansible_distribution_major_version' from source: facts 30583 1726853702.91735: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853702.91794: variable 'omit' from source: magic vars 30583 1726853702.91814: variable 'omit' from source: magic vars 30583 1726853702.91976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853702.95381: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853702.95702: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853702.95800: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853702.95904: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853702.95907: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853702.96086: variable 'network_provider' from source: set_fact 30583 1726853702.96366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853702.96680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853702.96683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853702.96689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853702.96809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853702.97034: variable 'omit' from source: magic vars 30583 1726853702.97257: variable 'omit' from source: magic vars 30583 1726853702.97593: variable 'network_connections' from source: include params 30583 1726853702.97608: variable 'interface' from source: play vars 30583 1726853702.97745: variable 'interface' from source: play vars 30583 1726853702.98245: variable 'omit' from source: magic vars 30583 1726853702.98248: variable '__lsr_ansible_managed' from source: task vars 30583 1726853702.98280: variable '__lsr_ansible_managed' from source: task vars 30583 1726853702.98661: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30583 1726853702.99180: Loaded config def from plugin (lookup/template) 30583 1726853702.99190: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30583 1726853702.99251: File lookup term: get_ansible_managed.j2 30583 1726853702.99262: variable 
'ansible_search_path' from source: unknown 30583 1726853702.99296: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30583 1726853702.99316: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30583 1726853702.99546: variable 'ansible_search_path' from source: unknown 30583 1726853703.09113: variable 'ansible_managed' from source: unknown 30583 1726853703.09215: variable 'omit' from source: magic vars 30583 1726853703.09236: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853703.09256: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853703.09276: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853703.09290: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30583 1726853703.09298: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853703.09320: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853703.09323: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853703.09328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853703.09393: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853703.09400: Set connection var ansible_timeout to 10 30583 1726853703.09402: Set connection var ansible_connection to ssh 30583 1726853703.09407: Set connection var ansible_shell_executable to /bin/sh 30583 1726853703.09409: Set connection var ansible_shell_type to sh 30583 1726853703.09417: Set connection var ansible_pipelining to False 30583 1726853703.09438: variable 'ansible_shell_executable' from source: unknown 30583 1726853703.09441: variable 'ansible_connection' from source: unknown 30583 1726853703.09443: variable 'ansible_module_compression' from source: unknown 30583 1726853703.09446: variable 'ansible_shell_type' from source: unknown 30583 1726853703.09448: variable 'ansible_shell_executable' from source: unknown 30583 1726853703.09451: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853703.09453: variable 'ansible_pipelining' from source: unknown 30583 1726853703.09489: variable 'ansible_timeout' from source: unknown 30583 1726853703.09492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853703.09566: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853703.09579: variable 'omit' from 
source: magic vars 30583 1726853703.09591: starting attempt loop 30583 1726853703.09594: running the handler 30583 1726853703.09622: _low_level_execute_command(): starting 30583 1726853703.09625: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853703.10379: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853703.10385: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853703.10387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853703.10389: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853703.10391: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853703.10496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853703.10499: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853703.10605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853703.12300: stdout chunk (state=3): >>>/root <<< 30583 1726853703.12423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 
1726853703.12444: stderr chunk (state=3): >>><<< 30583 1726853703.12447: stdout chunk (state=3): >>><<< 30583 1726853703.12476: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853703.12482: _low_level_execute_command(): starting 30583 1726853703.12489: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853703.1246889-32377-153882576375508 `" && echo ansible-tmp-1726853703.1246889-32377-153882576375508="` echo /root/.ansible/tmp/ansible-tmp-1726853703.1246889-32377-153882576375508 `" ) && sleep 0' 30583 1726853703.12918: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30583 1726853703.12922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853703.12925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853703.12927: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853703.12929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853703.12982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853703.12987: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853703.12989: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853703.13061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853703.15110: stdout chunk (state=3): >>>ansible-tmp-1726853703.1246889-32377-153882576375508=/root/.ansible/tmp/ansible-tmp-1726853703.1246889-32377-153882576375508 <<< 30583 1726853703.15240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853703.15243: stdout chunk (state=3): >>><<< 30583 1726853703.15245: stderr chunk (state=3): >>><<< 30583 1726853703.15342: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853703.1246889-32377-153882576375508=/root/.ansible/tmp/ansible-tmp-1726853703.1246889-32377-153882576375508 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853703.15347: variable 'ansible_module_compression' from source: unknown 30583 1726853703.15350: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30583 1726853703.15405: variable 'ansible_facts' from source: unknown 30583 1726853703.15564: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853703.1246889-32377-153882576375508/AnsiballZ_network_connections.py 30583 1726853703.15686: Sending initial data 30583 1726853703.15690: Sent initial data (168 bytes) 30583 1726853703.16118: stderr chunk (state=3): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853703.16122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853703.16128: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853703.16130: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853703.16141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853703.16184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853703.16187: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853703.16261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853703.17902: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server 
supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30583 1726853703.17907: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853703.17974: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853703.18044: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp228ryzyw /root/.ansible/tmp/ansible-tmp-1726853703.1246889-32377-153882576375508/AnsiballZ_network_connections.py <<< 30583 1726853703.18051: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853703.1246889-32377-153882576375508/AnsiballZ_network_connections.py" <<< 30583 1726853703.18138: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp228ryzyw" to remote "/root/.ansible/tmp/ansible-tmp-1726853703.1246889-32377-153882576375508/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853703.1246889-32377-153882576375508/AnsiballZ_network_connections.py" <<< 30583 1726853703.19297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853703.19331: stderr chunk (state=3): >>><<< 30583 1726853703.19334: stdout chunk (state=3): >>><<< 30583 1726853703.19353: done transferring module to remote 30583 1726853703.19366: _low_level_execute_command(): starting 30583 1726853703.19369: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853703.1246889-32377-153882576375508/ /root/.ansible/tmp/ansible-tmp-1726853703.1246889-32377-153882576375508/AnsiballZ_network_connections.py && sleep 0' 30583 1726853703.19815: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853703.19819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853703.19821: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853703.19823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853703.19874: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853703.19877: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853703.19882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853703.19949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853703.21889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853703.21892: stdout chunk (state=3): >>><<< 30583 1726853703.21894: stderr chunk (state=3): >>><<< 30583 1726853703.21911: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853703.21919: _low_level_execute_command(): starting 30583 1726853703.21927: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853703.1246889-32377-153882576375508/AnsiballZ_network_connections.py && sleep 0' 30583 1726853703.22522: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853703.22526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853703.22529: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853703.22567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853703.22611: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853703.22629: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853703.22696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853703.51344: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, a240f7a0-666a-4048-8567-0de2206b9c72\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30583 1726853703.55386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853703.55390: stdout chunk (state=3): >>><<< 30583 1726853703.55392: stderr chunk (state=3): >>><<< 30583 1726853703.55597: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, a240f7a0-666a-4048-8567-0de2206b9c72\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853703.55635: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853703.1246889-32377-153882576375508/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853703.55656: _low_level_execute_command(): starting 30583 1726853703.55659: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853703.1246889-32377-153882576375508/ > /dev/null 2>&1 && sleep 0' 30583 1726853703.56844: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853703.56848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 
1726853703.56850: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853703.56956: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853703.56959: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853703.56961: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853703.56988: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853703.56993: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853703.57108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853703.59107: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853703.59137: stderr chunk (state=3): >>><<< 30583 1726853703.59377: stdout chunk (state=3): >>><<< 30583 1726853703.59381: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853703.59387: handler run complete 30583 1726853703.59392: attempt loop complete, returning result 30583 1726853703.59394: _execute() done 30583 1726853703.59396: dumping result to json 30583 1726853703.59398: done dumping result, returning 30583 1726853703.59400: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-05ea-abc5-000000000b43] 30583 1726853703.59406: sending task result for task 02083763-bbaf-05ea-abc5-000000000b43 30583 1726853703.59496: done sending task result for task 02083763-bbaf-05ea-abc5-000000000b43 30583 1726853703.59499: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, a240f7a0-666a-4048-8567-0de2206b9c72 30583 1726853703.59629: no more pending results, returning what we have 30583 1726853703.59633: results queue empty 30583 1726853703.59634: checking for any_errors_fatal 30583 1726853703.59640: done checking for any_errors_fatal 30583 1726853703.59640: checking 
for max_fail_percentage 30583 1726853703.59647: done checking for max_fail_percentage 30583 1726853703.59648: checking to see if all hosts have failed and the running result is not ok 30583 1726853703.59648: done checking to see if all hosts have failed 30583 1726853703.59652: getting the remaining hosts for this loop 30583 1726853703.59655: done getting the remaining hosts for this loop 30583 1726853703.59659: getting the next task for host managed_node2 30583 1726853703.59667: done getting next task for host managed_node2 30583 1726853703.59899: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853703.59906: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853703.59927: getting variables 30583 1726853703.59929: in VariableManager get_vars() 30583 1726853703.60098: Calling all_inventory to load vars for managed_node2 30583 1726853703.60102: Calling groups_inventory to load vars for managed_node2 30583 1726853703.60105: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853703.60115: Calling all_plugins_play to load vars for managed_node2 30583 1726853703.60118: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853703.60122: Calling groups_plugins_play to load vars for managed_node2 30583 1726853703.63948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853703.66388: done with get_vars() 30583 1726853703.66489: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:35:03 -0400 (0:00:00.763) 0:00:39.003 ****** 30583 1726853703.66664: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30583 1726853703.67307: worker is 1 (out of 1 available) 30583 1726853703.67317: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30583 1726853703.67330: done queuing things up, now waiting for results queue to drain 30583 1726853703.67331: waiting for pending results... 
30583 1726853703.67708: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853703.67964: in run() - task 02083763-bbaf-05ea-abc5-000000000b44 30583 1726853703.67968: variable 'ansible_search_path' from source: unknown 30583 1726853703.67973: variable 'ansible_search_path' from source: unknown 30583 1726853703.68022: calling self._execute() 30583 1726853703.68228: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853703.68234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853703.68241: variable 'omit' from source: magic vars 30583 1726853703.69025: variable 'ansible_distribution_major_version' from source: facts 30583 1726853703.69029: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853703.69301: variable 'network_state' from source: role '' defaults 30583 1726853703.69319: Evaluated conditional (network_state != {}): False 30583 1726853703.69327: when evaluation is False, skipping this task 30583 1726853703.69334: _execute() done 30583 1726853703.69340: dumping result to json 30583 1726853703.69354: done dumping result, returning 30583 1726853703.69369: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-05ea-abc5-000000000b44] 30583 1726853703.69379: sending task result for task 02083763-bbaf-05ea-abc5-000000000b44 30583 1726853703.69539: done sending task result for task 02083763-bbaf-05ea-abc5-000000000b44 30583 1726853703.69542: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853703.69621: no more pending results, returning what we have 30583 1726853703.69626: results queue empty 30583 1726853703.69627: checking for any_errors_fatal 30583 1726853703.69640: done checking for any_errors_fatal 
30583 1726853703.69640: checking for max_fail_percentage 30583 1726853703.69642: done checking for max_fail_percentage 30583 1726853703.69643: checking to see if all hosts have failed and the running result is not ok 30583 1726853703.69644: done checking to see if all hosts have failed 30583 1726853703.69645: getting the remaining hosts for this loop 30583 1726853703.69647: done getting the remaining hosts for this loop 30583 1726853703.69651: getting the next task for host managed_node2 30583 1726853703.69659: done getting next task for host managed_node2 30583 1726853703.69663: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30583 1726853703.69670: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853703.69801: getting variables 30583 1726853703.69803: in VariableManager get_vars() 30583 1726853703.69845: Calling all_inventory to load vars for managed_node2 30583 1726853703.69849: Calling groups_inventory to load vars for managed_node2 30583 1726853703.69851: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853703.69863: Calling all_plugins_play to load vars for managed_node2 30583 1726853703.69867: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853703.69870: Calling groups_plugins_play to load vars for managed_node2 30583 1726853703.71379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853703.72949: done with get_vars() 30583 1726853703.72981: done getting variables 30583 1726853703.73045: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:35:03 -0400 (0:00:00.064) 0:00:39.068 ****** 30583 1726853703.73085: entering _queue_task() for managed_node2/debug 30583 1726853703.73679: worker is 1 (out of 1 available) 30583 1726853703.73689: exiting _queue_task() for managed_node2/debug 30583 1726853703.73698: done queuing things up, now waiting for results queue to drain 30583 1726853703.73699: waiting for pending results... 
30583 1726853703.73829: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30583 1726853703.73939: in run() - task 02083763-bbaf-05ea-abc5-000000000b45 30583 1726853703.73943: variable 'ansible_search_path' from source: unknown 30583 1726853703.73946: variable 'ansible_search_path' from source: unknown 30583 1726853703.74034: calling self._execute() 30583 1726853703.74088: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853703.74099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853703.74112: variable 'omit' from source: magic vars 30583 1726853703.74502: variable 'ansible_distribution_major_version' from source: facts 30583 1726853703.74520: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853703.74532: variable 'omit' from source: magic vars 30583 1726853703.74605: variable 'omit' from source: magic vars 30583 1726853703.74644: variable 'omit' from source: magic vars 30583 1726853703.74788: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853703.74792: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853703.74794: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853703.74797: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853703.74799: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853703.74831: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853703.74839: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853703.74876: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 30583 1726853703.74958: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853703.74973: Set connection var ansible_timeout to 10 30583 1726853703.74980: Set connection var ansible_connection to ssh 30583 1726853703.74989: Set connection var ansible_shell_executable to /bin/sh 30583 1726853703.74995: Set connection var ansible_shell_type to sh 30583 1726853703.75020: Set connection var ansible_pipelining to False 30583 1726853703.75113: variable 'ansible_shell_executable' from source: unknown 30583 1726853703.75116: variable 'ansible_connection' from source: unknown 30583 1726853703.75119: variable 'ansible_module_compression' from source: unknown 30583 1726853703.75121: variable 'ansible_shell_type' from source: unknown 30583 1726853703.75127: variable 'ansible_shell_executable' from source: unknown 30583 1726853703.75129: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853703.75131: variable 'ansible_pipelining' from source: unknown 30583 1726853703.75133: variable 'ansible_timeout' from source: unknown 30583 1726853703.75135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853703.75244: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853703.75260: variable 'omit' from source: magic vars 30583 1726853703.75279: starting attempt loop 30583 1726853703.75283: running the handler 30583 1726853703.75382: variable '__network_connections_result' from source: set_fact 30583 1726853703.75423: handler run complete 30583 1726853703.75440: attempt loop complete, returning result 30583 1726853703.75443: _execute() done 30583 1726853703.75446: dumping result to json 30583 1726853703.75450: 
done dumping result, returning 30583 1726853703.75453: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-05ea-abc5-000000000b45] 30583 1726853703.75460: sending task result for task 02083763-bbaf-05ea-abc5-000000000b45 30583 1726853703.75547: done sending task result for task 02083763-bbaf-05ea-abc5-000000000b45 30583 1726853703.75549: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, a240f7a0-666a-4048-8567-0de2206b9c72" ] } 30583 1726853703.75630: no more pending results, returning what we have 30583 1726853703.75634: results queue empty 30583 1726853703.75635: checking for any_errors_fatal 30583 1726853703.75640: done checking for any_errors_fatal 30583 1726853703.75641: checking for max_fail_percentage 30583 1726853703.75642: done checking for max_fail_percentage 30583 1726853703.75643: checking to see if all hosts have failed and the running result is not ok 30583 1726853703.75644: done checking to see if all hosts have failed 30583 1726853703.75644: getting the remaining hosts for this loop 30583 1726853703.75646: done getting the remaining hosts for this loop 30583 1726853703.75650: getting the next task for host managed_node2 30583 1726853703.75660: done getting next task for host managed_node2 30583 1726853703.75664: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30583 1726853703.75668: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853703.75683: getting variables 30583 1726853703.75684: in VariableManager get_vars() 30583 1726853703.75717: Calling all_inventory to load vars for managed_node2 30583 1726853703.75720: Calling groups_inventory to load vars for managed_node2 30583 1726853703.75722: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853703.75730: Calling all_plugins_play to load vars for managed_node2 30583 1726853703.75732: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853703.75734: Calling groups_plugins_play to load vars for managed_node2 30583 1726853703.76638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853703.77643: done with get_vars() 30583 1726853703.77660: done getting variables 30583 1726853703.77702: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:35:03 -0400 (0:00:00.046) 0:00:39.114 ****** 30583 1726853703.77733: entering _queue_task() for managed_node2/debug 30583 1726853703.77974: worker is 1 (out of 1 available) 30583 1726853703.77988: exiting _queue_task() for managed_node2/debug 30583 1726853703.78000: done queuing things up, now waiting for results queue to drain 30583 1726853703.78002: waiting for pending results... 30583 1726853703.78296: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30583 1726853703.78323: in run() - task 02083763-bbaf-05ea-abc5-000000000b46 30583 1726853703.78335: variable 'ansible_search_path' from source: unknown 30583 1726853703.78339: variable 'ansible_search_path' from source: unknown 30583 1726853703.78373: calling self._execute() 30583 1726853703.78464: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853703.78468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853703.78480: variable 'omit' from source: magic vars 30583 1726853703.78826: variable 'ansible_distribution_major_version' from source: facts 30583 1726853703.78833: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853703.78839: variable 'omit' from source: magic vars 30583 1726853703.78879: variable 'omit' from source: magic vars 30583 1726853703.78901: variable 'omit' from source: magic vars 30583 1726853703.78934: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853703.78962: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853703.78981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853703.78994: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853703.79003: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853703.79027: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853703.79030: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853703.79034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853703.79104: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853703.79108: Set connection var ansible_timeout to 10 30583 1726853703.79111: Set connection var ansible_connection to ssh 30583 1726853703.79116: Set connection var ansible_shell_executable to /bin/sh 30583 1726853703.79119: Set connection var ansible_shell_type to sh 30583 1726853703.79126: Set connection var ansible_pipelining to False 30583 1726853703.79145: variable 'ansible_shell_executable' from source: unknown 30583 1726853703.79148: variable 'ansible_connection' from source: unknown 30583 1726853703.79151: variable 'ansible_module_compression' from source: unknown 30583 1726853703.79157: variable 'ansible_shell_type' from source: unknown 30583 1726853703.79160: variable 'ansible_shell_executable' from source: unknown 30583 1726853703.79162: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853703.79164: variable 'ansible_pipelining' from source: unknown 30583 1726853703.79166: variable 'ansible_timeout' from source: unknown 30583 1726853703.79168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853703.79262: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853703.79269: variable 'omit' from source: magic vars 30583 1726853703.79276: starting attempt loop 30583 1726853703.79288: running the handler 30583 1726853703.79321: variable '__network_connections_result' from source: set_fact 30583 1726853703.79376: variable '__network_connections_result' from source: set_fact 30583 1726853703.79460: handler run complete 30583 1726853703.79478: attempt loop complete, returning result 30583 1726853703.79481: _execute() done 30583 1726853703.79484: dumping result to json 30583 1726853703.79486: done dumping result, returning 30583 1726853703.79495: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-05ea-abc5-000000000b46] 30583 1726853703.79499: sending task result for task 02083763-bbaf-05ea-abc5-000000000b46 30583 1726853703.79591: done sending task result for task 02083763-bbaf-05ea-abc5-000000000b46 30583 1726853703.79593: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, a240f7a0-666a-4048-8567-0de2206b9c72\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, a240f7a0-666a-4048-8567-0de2206b9c72" ] } } 30583 1726853703.79699: no more pending results, returning what we have 30583 1726853703.79702: results queue 
empty 30583 1726853703.79703: checking for any_errors_fatal 30583 1726853703.79708: done checking for any_errors_fatal 30583 1726853703.79709: checking for max_fail_percentage 30583 1726853703.79710: done checking for max_fail_percentage 30583 1726853703.79711: checking to see if all hosts have failed and the running result is not ok 30583 1726853703.79712: done checking to see if all hosts have failed 30583 1726853703.79712: getting the remaining hosts for this loop 30583 1726853703.79714: done getting the remaining hosts for this loop 30583 1726853703.79719: getting the next task for host managed_node2 30583 1726853703.79725: done getting next task for host managed_node2 30583 1726853703.79728: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30583 1726853703.79732: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853703.79742: getting variables 30583 1726853703.79744: in VariableManager get_vars() 30583 1726853703.79787: Calling all_inventory to load vars for managed_node2 30583 1726853703.79790: Calling groups_inventory to load vars for managed_node2 30583 1726853703.79792: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853703.79800: Calling all_plugins_play to load vars for managed_node2 30583 1726853703.79802: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853703.79805: Calling groups_plugins_play to load vars for managed_node2 30583 1726853703.80565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853703.81534: done with get_vars() 30583 1726853703.81550: done getting variables 30583 1726853703.81598: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:35:03 -0400 (0:00:00.038) 0:00:39.153 ****** 30583 1726853703.81625: entering _queue_task() for managed_node2/debug 30583 1726853703.81886: worker is 1 (out of 1 available) 30583 1726853703.81900: exiting _queue_task() for managed_node2/debug 30583 1726853703.81912: done queuing things up, now waiting for results queue to drain 30583 1726853703.81913: waiting for pending results... 
30583 1726853703.82104: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30583 1726853703.82207: in run() - task 02083763-bbaf-05ea-abc5-000000000b47 30583 1726853703.82218: variable 'ansible_search_path' from source: unknown 30583 1726853703.82221: variable 'ansible_search_path' from source: unknown 30583 1726853703.82252: calling self._execute() 30583 1726853703.82326: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853703.82330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853703.82337: variable 'omit' from source: magic vars 30583 1726853703.82642: variable 'ansible_distribution_major_version' from source: facts 30583 1726853703.82652: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853703.82742: variable 'network_state' from source: role '' defaults 30583 1726853703.82751: Evaluated conditional (network_state != {}): False 30583 1726853703.82755: when evaluation is False, skipping this task 30583 1726853703.82757: _execute() done 30583 1726853703.82762: dumping result to json 30583 1726853703.82765: done dumping result, returning 30583 1726853703.82775: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-05ea-abc5-000000000b47] 30583 1726853703.82779: sending task result for task 02083763-bbaf-05ea-abc5-000000000b47 30583 1726853703.82862: done sending task result for task 02083763-bbaf-05ea-abc5-000000000b47 30583 1726853703.82864: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 30583 1726853703.82946: no more pending results, returning what we have 30583 1726853703.82952: results queue empty 30583 1726853703.82953: checking for any_errors_fatal 30583 1726853703.82966: done checking for any_errors_fatal 30583 1726853703.82967: checking for 
max_fail_percentage 30583 1726853703.82969: done checking for max_fail_percentage 30583 1726853703.82970: checking to see if all hosts have failed and the running result is not ok 30583 1726853703.82977: done checking to see if all hosts have failed 30583 1726853703.82978: getting the remaining hosts for this loop 30583 1726853703.82980: done getting the remaining hosts for this loop 30583 1726853703.82984: getting the next task for host managed_node2 30583 1726853703.82993: done getting next task for host managed_node2 30583 1726853703.82997: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30583 1726853703.83002: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853703.83022: getting variables 30583 1726853703.83023: in VariableManager get_vars() 30583 1726853703.83056: Calling all_inventory to load vars for managed_node2 30583 1726853703.83058: Calling groups_inventory to load vars for managed_node2 30583 1726853703.83060: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853703.83069: Calling all_plugins_play to load vars for managed_node2 30583 1726853703.83072: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853703.83075: Calling groups_plugins_play to load vars for managed_node2 30583 1726853703.83858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853703.85139: done with get_vars() 30583 1726853703.85160: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:35:03 -0400 (0:00:00.036) 0:00:39.189 ****** 30583 1726853703.85249: entering _queue_task() for managed_node2/ping 30583 1726853703.85645: worker is 1 (out of 1 available) 30583 1726853703.85659: exiting _queue_task() for managed_node2/ping 30583 1726853703.85673: done queuing things up, now waiting for results queue to drain 30583 1726853703.85675: waiting for pending results... 
30583 1726853703.86416: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 30583 1726853703.86483: in run() - task 02083763-bbaf-05ea-abc5-000000000b48 30583 1726853703.86487: variable 'ansible_search_path' from source: unknown 30583 1726853703.86490: variable 'ansible_search_path' from source: unknown 30583 1726853703.86626: calling self._execute() 30583 1726853703.86672: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853703.86676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853703.86703: variable 'omit' from source: magic vars 30583 1726853703.87063: variable 'ansible_distribution_major_version' from source: facts 30583 1726853703.87074: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853703.87082: variable 'omit' from source: magic vars 30583 1726853703.87148: variable 'omit' from source: magic vars 30583 1726853703.87241: variable 'omit' from source: magic vars 30583 1726853703.87245: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853703.87247: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853703.87262: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853703.87281: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853703.87291: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853703.87319: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853703.87322: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853703.87324: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30583 1726853703.87430: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853703.87434: Set connection var ansible_timeout to 10 30583 1726853703.87436: Set connection var ansible_connection to ssh 30583 1726853703.87439: Set connection var ansible_shell_executable to /bin/sh 30583 1726853703.87441: Set connection var ansible_shell_type to sh 30583 1726853703.87516: Set connection var ansible_pipelining to False 30583 1726853703.87519: variable 'ansible_shell_executable' from source: unknown 30583 1726853703.87522: variable 'ansible_connection' from source: unknown 30583 1726853703.87525: variable 'ansible_module_compression' from source: unknown 30583 1726853703.87527: variable 'ansible_shell_type' from source: unknown 30583 1726853703.87529: variable 'ansible_shell_executable' from source: unknown 30583 1726853703.87531: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853703.87532: variable 'ansible_pipelining' from source: unknown 30583 1726853703.87534: variable 'ansible_timeout' from source: unknown 30583 1726853703.87536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853703.87917: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853703.87921: variable 'omit' from source: magic vars 30583 1726853703.87924: starting attempt loop 30583 1726853703.87926: running the handler 30583 1726853703.87928: _low_level_execute_command(): starting 30583 1726853703.87930: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853703.88526: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 
1726853703.88550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853703.88568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853703.88612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853703.88616: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853703.88618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853703.88697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853703.90426: stdout chunk (state=3): >>>/root <<< 30583 1726853703.90525: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853703.90557: stderr chunk (state=3): >>><<< 30583 1726853703.90560: stdout chunk (state=3): >>><<< 30583 1726853703.90579: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853703.90590: _low_level_execute_command(): starting 30583 1726853703.90596: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853703.9057915-32429-14668499400327 `" && echo ansible-tmp-1726853703.9057915-32429-14668499400327="` echo /root/.ansible/tmp/ansible-tmp-1726853703.9057915-32429-14668499400327 `" ) && sleep 0' 30583 1726853703.91021: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853703.91024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853703.91027: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853703.91037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853703.91077: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853703.91087: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853703.91157: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853703.93160: stdout chunk (state=3): >>>ansible-tmp-1726853703.9057915-32429-14668499400327=/root/.ansible/tmp/ansible-tmp-1726853703.9057915-32429-14668499400327 <<< 30583 1726853703.93319: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853703.93322: stdout chunk (state=3): >>><<< 30583 1726853703.93324: stderr chunk (state=3): >>><<< 30583 1726853703.93476: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853703.9057915-32429-14668499400327=/root/.ansible/tmp/ansible-tmp-1726853703.9057915-32429-14668499400327 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853703.93479: variable 'ansible_module_compression' from source: unknown 30583 1726853703.93482: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30583 1726853703.93484: variable 'ansible_facts' from source: unknown 30583 1726853703.93564: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853703.9057915-32429-14668499400327/AnsiballZ_ping.py 30583 1726853703.93729: Sending initial data 30583 1726853703.93739: Sent initial data (152 bytes) 30583 1726853703.94162: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853703.94168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853703.94190: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853703.94193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853703.94242: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853703.94246: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853703.94323: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853703.96077: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853703.96142: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853703.96242: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpkuwlm5a4 /root/.ansible/tmp/ansible-tmp-1726853703.9057915-32429-14668499400327/AnsiballZ_ping.py <<< 30583 1726853703.96250: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853703.9057915-32429-14668499400327/AnsiballZ_ping.py" <<< 30583 1726853703.96313: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 30583 1726853703.96324: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpkuwlm5a4" to remote "/root/.ansible/tmp/ansible-tmp-1726853703.9057915-32429-14668499400327/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853703.9057915-32429-14668499400327/AnsiballZ_ping.py" <<< 30583 1726853703.96961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853703.96969: stderr chunk (state=3): >>><<< 30583 1726853703.96973: stdout chunk (state=3): >>><<< 30583 1726853703.97012: done transferring module to remote 30583 1726853703.97021: _low_level_execute_command(): starting 30583 1726853703.97026: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853703.9057915-32429-14668499400327/ /root/.ansible/tmp/ansible-tmp-1726853703.9057915-32429-14668499400327/AnsiballZ_ping.py && sleep 0' 30583 1726853703.97433: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853703.97436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853703.97439: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853703.97441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853703.97488: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853703.97497: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853703.97566: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853703.99419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853703.99445: stderr chunk (state=3): >>><<< 30583 1726853703.99448: stdout chunk (state=3): >>><<< 30583 1726853703.99477: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853703.99482: _low_level_execute_command(): starting 30583 1726853703.99484: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853703.9057915-32429-14668499400327/AnsiballZ_ping.py && sleep 0' 30583 1726853703.99884: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853703.99888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853703.99890: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853703.99892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853703.99894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853703.99940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' 
debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853703.99943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853704.00019: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853704.15751: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30583 1726853704.17160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853704.17165: stdout chunk (state=3): >>><<< 30583 1726853704.17167: stderr chunk (state=3): >>><<< 30583 1726853704.17207: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853704.17218: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853703.9057915-32429-14668499400327/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853704.17244: _low_level_execute_command(): starting 30583 1726853704.17259: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853703.9057915-32429-14668499400327/ > /dev/null 2>&1 && sleep 0' 30583 1726853704.17824: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853704.17827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853704.17844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853704.17847: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853704.17849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853704.17899: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853704.17905: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853704.17907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853704.17984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853704.19900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853704.19931: stderr chunk (state=3): >>><<< 30583 1726853704.19934: stdout chunk (state=3): >>><<< 30583 1726853704.19947: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 30583 1726853704.19955: handler run complete 30583 1726853704.19968: attempt loop complete, returning result 30583 1726853704.19972: _execute() done 30583 1726853704.19975: dumping result to json 30583 1726853704.19977: done dumping result, returning 30583 1726853704.19986: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-05ea-abc5-000000000b48] 30583 1726853704.19989: sending task result for task 02083763-bbaf-05ea-abc5-000000000b48 30583 1726853704.20074: done sending task result for task 02083763-bbaf-05ea-abc5-000000000b48 30583 1726853704.20076: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 30583 1726853704.20166: no more pending results, returning what we have 30583 1726853704.20169: results queue empty 30583 1726853704.20173: checking for any_errors_fatal 30583 1726853704.20178: done checking for any_errors_fatal 30583 1726853704.20179: checking for max_fail_percentage 30583 1726853704.20181: done checking for max_fail_percentage 30583 1726853704.20182: checking to see if all hosts have failed and the running result is not ok 30583 1726853704.20183: done checking to see if all hosts have failed 30583 1726853704.20183: getting the remaining hosts for this loop 30583 1726853704.20185: done getting the remaining hosts for this loop 30583 1726853704.20190: getting the next task for host managed_node2 30583 1726853704.20200: done getting next task for host managed_node2 30583 1726853704.20203: ^ task is: TASK: meta (role_complete) 30583 1726853704.20209: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853704.20221: getting variables 30583 1726853704.20223: in VariableManager get_vars() 30583 1726853704.20267: Calling all_inventory to load vars for managed_node2 30583 1726853704.20270: Calling groups_inventory to load vars for managed_node2 30583 1726853704.20293: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853704.20303: Calling all_plugins_play to load vars for managed_node2 30583 1726853704.20306: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853704.20308: Calling groups_plugins_play to load vars for managed_node2 30583 1726853704.25537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853704.26788: done with get_vars() 30583 1726853704.26807: done getting variables 30583 1726853704.26860: done queuing things up, now waiting for results queue to drain 30583 1726853704.26862: results queue empty 30583 1726853704.26862: checking for any_errors_fatal 30583 1726853704.26864: done checking for 
any_errors_fatal 30583 1726853704.26865: checking for max_fail_percentage 30583 1726853704.26865: done checking for max_fail_percentage 30583 1726853704.26866: checking to see if all hosts have failed and the running result is not ok 30583 1726853704.26866: done checking to see if all hosts have failed 30583 1726853704.26867: getting the remaining hosts for this loop 30583 1726853704.26868: done getting the remaining hosts for this loop 30583 1726853704.26870: getting the next task for host managed_node2 30583 1726853704.26875: done getting next task for host managed_node2 30583 1726853704.26876: ^ task is: TASK: Show result 30583 1726853704.26878: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853704.26879: getting variables 30583 1726853704.26880: in VariableManager get_vars() 30583 1726853704.26889: Calling all_inventory to load vars for managed_node2 30583 1726853704.26891: Calling groups_inventory to load vars for managed_node2 30583 1726853704.26894: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853704.26899: Calling all_plugins_play to load vars for managed_node2 30583 1726853704.26901: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853704.26904: Calling groups_plugins_play to load vars for managed_node2 30583 1726853704.27797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853704.29343: done with get_vars() 30583 1726853704.29368: done getting variables 30583 1726853704.29415: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Friday 20 September 2024 13:35:04 -0400 (0:00:00.441) 0:00:39.631 ****** 30583 1726853704.29445: entering _queue_task() for managed_node2/debug 30583 1726853704.29823: worker is 1 (out of 1 available) 30583 1726853704.29836: exiting _queue_task() for managed_node2/debug 30583 1726853704.29849: done queuing things up, now waiting for results queue to drain 30583 1726853704.29851: waiting for pending results... 
30583 1726853704.30199: running TaskExecutor() for managed_node2/TASK: Show result 30583 1726853704.30577: in run() - task 02083763-bbaf-05ea-abc5-000000000ad2 30583 1726853704.30582: variable 'ansible_search_path' from source: unknown 30583 1726853704.30586: variable 'ansible_search_path' from source: unknown 30583 1726853704.30589: calling self._execute() 30583 1726853704.30724: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853704.30740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853704.30840: variable 'omit' from source: magic vars 30583 1726853704.31192: variable 'ansible_distribution_major_version' from source: facts 30583 1726853704.31212: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853704.31225: variable 'omit' from source: magic vars 30583 1726853704.31289: variable 'omit' from source: magic vars 30583 1726853704.31332: variable 'omit' from source: magic vars 30583 1726853704.31386: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853704.31429: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853704.31456: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853704.31482: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853704.31508: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853704.31545: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853704.31555: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853704.31565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853704.31713: Set 
connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853704.31716: Set connection var ansible_timeout to 10 30583 1726853704.31719: Set connection var ansible_connection to ssh 30583 1726853704.31721: Set connection var ansible_shell_executable to /bin/sh 30583 1726853704.31724: Set connection var ansible_shell_type to sh 30583 1726853704.31727: Set connection var ansible_pipelining to False 30583 1726853704.31755: variable 'ansible_shell_executable' from source: unknown 30583 1726853704.31765: variable 'ansible_connection' from source: unknown 30583 1726853704.31777: variable 'ansible_module_compression' from source: unknown 30583 1726853704.31822: variable 'ansible_shell_type' from source: unknown 30583 1726853704.31825: variable 'ansible_shell_executable' from source: unknown 30583 1726853704.31828: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853704.31830: variable 'ansible_pipelining' from source: unknown 30583 1726853704.31833: variable 'ansible_timeout' from source: unknown 30583 1726853704.31835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853704.31966: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853704.31989: variable 'omit' from source: magic vars 30583 1726853704.32040: starting attempt loop 30583 1726853704.32044: running the handler 30583 1726853704.32066: variable '__network_connections_result' from source: set_fact 30583 1726853704.32194: variable '__network_connections_result' from source: set_fact 30583 1726853704.32577: handler run complete 30583 1726853704.32581: attempt loop complete, returning result 30583 1726853704.32583: _execute() done 30583 1726853704.32585: dumping result to json 30583 
1726853704.32587: done dumping result, returning 30583 1726853704.32590: done running TaskExecutor() for managed_node2/TASK: Show result [02083763-bbaf-05ea-abc5-000000000ad2] 30583 1726853704.32592: sending task result for task 02083763-bbaf-05ea-abc5-000000000ad2 30583 1726853704.32660: done sending task result for task 02083763-bbaf-05ea-abc5-000000000ad2 30583 1726853704.32665: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, a240f7a0-666a-4048-8567-0de2206b9c72\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, a240f7a0-666a-4048-8567-0de2206b9c72" ] } } 30583 1726853704.32761: no more pending results, returning what we have 30583 1726853704.32765: results queue empty 30583 1726853704.32766: checking for any_errors_fatal 30583 1726853704.32768: done checking for any_errors_fatal 30583 1726853704.32768: checking for max_fail_percentage 30583 1726853704.32776: done checking for max_fail_percentage 30583 1726853704.32778: checking to see if all hosts have failed and the running result is not ok 30583 1726853704.32779: done checking to see if all hosts have failed 30583 1726853704.32780: getting the remaining hosts for this loop 30583 1726853704.32782: done getting the remaining hosts for this loop 30583 1726853704.32790: getting the next task for host managed_node2 30583 1726853704.32801: done getting next task for host managed_node2 30583 1726853704.32805: ^ task is: TASK: Test 30583 1726853704.32808: ^ state is: HOST STATE: 
block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853704.32812: getting variables 30583 1726853704.32814: in VariableManager get_vars() 30583 1726853704.32845: Calling all_inventory to load vars for managed_node2 30583 1726853704.32848: Calling groups_inventory to load vars for managed_node2 30583 1726853704.32852: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853704.32862: Calling all_plugins_play to load vars for managed_node2 30583 1726853704.32867: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853704.32976: Calling groups_plugins_play to load vars for managed_node2 30583 1726853704.35072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853704.36438: done with get_vars() 30583 1726853704.36456: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 13:35:04 -0400 (0:00:00.070) 0:00:39.702 ****** 30583 1726853704.36543: entering _queue_task() for managed_node2/include_tasks 30583 1726853704.36907: worker is 1 (out of 1 available) 30583 1726853704.36921: exiting _queue_task() for managed_node2/include_tasks 30583 1726853704.36932: done queuing things up, now waiting for results queue to drain 30583 1726853704.36934: waiting for pending results... 
30583 1726853704.37185: running TaskExecutor() for managed_node2/TASK: Test 30583 1726853704.37270: in run() - task 02083763-bbaf-05ea-abc5-000000000a4d 30583 1726853704.37282: variable 'ansible_search_path' from source: unknown 30583 1726853704.37285: variable 'ansible_search_path' from source: unknown 30583 1726853704.37318: variable 'lsr_test' from source: include params 30583 1726853704.37486: variable 'lsr_test' from source: include params 30583 1726853704.37569: variable 'omit' from source: magic vars 30583 1726853704.37695: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853704.37706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853704.37718: variable 'omit' from source: magic vars 30583 1726853704.37943: variable 'ansible_distribution_major_version' from source: facts 30583 1726853704.37950: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853704.37959: variable 'item' from source: unknown 30583 1726853704.38026: variable 'item' from source: unknown 30583 1726853704.38072: variable 'item' from source: unknown 30583 1726853704.38133: variable 'item' from source: unknown 30583 1726853704.38318: dumping result to json 30583 1726853704.38322: done dumping result, returning 30583 1726853704.38324: done running TaskExecutor() for managed_node2/TASK: Test [02083763-bbaf-05ea-abc5-000000000a4d] 30583 1726853704.38326: sending task result for task 02083763-bbaf-05ea-abc5-000000000a4d 30583 1726853704.38535: done sending task result for task 02083763-bbaf-05ea-abc5-000000000a4d 30583 1726853704.38540: WORKER PROCESS EXITING 30583 1726853704.38566: no more pending results, returning what we have 30583 1726853704.38573: in VariableManager get_vars() 30583 1726853704.38609: Calling all_inventory to load vars for managed_node2 30583 1726853704.38615: Calling groups_inventory to load vars for managed_node2 30583 1726853704.38619: Calling all_plugins_inventory to load 
vars for managed_node2 30583 1726853704.38629: Calling all_plugins_play to load vars for managed_node2 30583 1726853704.38632: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853704.38638: Calling groups_plugins_play to load vars for managed_node2 30583 1726853704.39950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853704.41393: done with get_vars() 30583 1726853704.41412: variable 'ansible_search_path' from source: unknown 30583 1726853704.41416: variable 'ansible_search_path' from source: unknown 30583 1726853704.41462: we have included files to process 30583 1726853704.41463: generating all_blocks data 30583 1726853704.41465: done generating all_blocks data 30583 1726853704.41473: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30583 1726853704.41474: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30583 1726853704.41480: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30583 1726853704.41622: done processing included file 30583 1726853704.41624: iterating over new_blocks loaded from include file 30583 1726853704.41626: in VariableManager get_vars() 30583 1726853704.41646: done with get_vars() 30583 1726853704.41648: filtering new block on tags 30583 1726853704.41683: done filtering new block on tags 30583 1726853704.41685: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed_node2 => (item=tasks/activate_profile.yml) 30583 1726853704.41690: extending task lists for all hosts with included blocks 30583 1726853704.42432: done extending task lists 30583 
1726853704.42433: done processing included files 30583 1726853704.42433: results queue empty 30583 1726853704.42434: checking for any_errors_fatal 30583 1726853704.42437: done checking for any_errors_fatal 30583 1726853704.42437: checking for max_fail_percentage 30583 1726853704.42438: done checking for max_fail_percentage 30583 1726853704.42439: checking to see if all hosts have failed and the running result is not ok 30583 1726853704.42439: done checking to see if all hosts have failed 30583 1726853704.42439: getting the remaining hosts for this loop 30583 1726853704.42440: done getting the remaining hosts for this loop 30583 1726853704.42442: getting the next task for host managed_node2 30583 1726853704.42445: done getting next task for host managed_node2 30583 1726853704.42446: ^ task is: TASK: Include network role 30583 1726853704.42448: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853704.42450: getting variables 30583 1726853704.42450: in VariableManager get_vars() 30583 1726853704.42459: Calling all_inventory to load vars for managed_node2 30583 1726853704.42461: Calling groups_inventory to load vars for managed_node2 30583 1726853704.42462: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853704.42466: Calling all_plugins_play to load vars for managed_node2 30583 1726853704.42467: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853704.42469: Calling groups_plugins_play to load vars for managed_node2 30583 1726853704.43190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853704.44030: done with get_vars() 30583 1726853704.44044: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3 Friday 20 September 2024 13:35:04 -0400 (0:00:00.075) 0:00:39.778 ****** 30583 1726853704.44112: entering _queue_task() for managed_node2/include_role 30583 1726853704.44367: worker is 1 (out of 1 available) 30583 1726853704.44381: exiting _queue_task() for managed_node2/include_role 30583 1726853704.44394: done queuing things up, now waiting for results queue to drain 30583 1726853704.44395: waiting for pending results... 
30583 1726853704.44578: running TaskExecutor() for managed_node2/TASK: Include network role 30583 1726853704.44662: in run() - task 02083763-bbaf-05ea-abc5-000000000caa 30583 1726853704.44674: variable 'ansible_search_path' from source: unknown 30583 1726853704.44678: variable 'ansible_search_path' from source: unknown 30583 1726853704.44705: calling self._execute() 30583 1726853704.44781: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853704.44785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853704.44794: variable 'omit' from source: magic vars 30583 1726853704.45073: variable 'ansible_distribution_major_version' from source: facts 30583 1726853704.45081: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853704.45086: _execute() done 30583 1726853704.45090: dumping result to json 30583 1726853704.45092: done dumping result, returning 30583 1726853704.45099: done running TaskExecutor() for managed_node2/TASK: Include network role [02083763-bbaf-05ea-abc5-000000000caa] 30583 1726853704.45102: sending task result for task 02083763-bbaf-05ea-abc5-000000000caa 30583 1726853704.45202: done sending task result for task 02083763-bbaf-05ea-abc5-000000000caa 30583 1726853704.45205: WORKER PROCESS EXITING 30583 1726853704.45233: no more pending results, returning what we have 30583 1726853704.45238: in VariableManager get_vars() 30583 1726853704.45278: Calling all_inventory to load vars for managed_node2 30583 1726853704.45281: Calling groups_inventory to load vars for managed_node2 30583 1726853704.45284: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853704.45296: Calling all_plugins_play to load vars for managed_node2 30583 1726853704.45298: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853704.45301: Calling groups_plugins_play to load vars for managed_node2 30583 1726853704.46144: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853704.47004: done with get_vars() 30583 1726853704.47016: variable 'ansible_search_path' from source: unknown 30583 1726853704.47017: variable 'ansible_search_path' from source: unknown 30583 1726853704.47101: variable 'omit' from source: magic vars 30583 1726853704.47127: variable 'omit' from source: magic vars 30583 1726853704.47136: variable 'omit' from source: magic vars 30583 1726853704.47139: we have included files to process 30583 1726853704.47139: generating all_blocks data 30583 1726853704.47140: done generating all_blocks data 30583 1726853704.47141: processing included file: fedora.linux_system_roles.network 30583 1726853704.47154: in VariableManager get_vars() 30583 1726853704.47164: done with get_vars() 30583 1726853704.47185: in VariableManager get_vars() 30583 1726853704.47195: done with get_vars() 30583 1726853704.47222: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30583 1726853704.47292: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30583 1726853704.47339: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30583 1726853704.47600: in VariableManager get_vars() 30583 1726853704.47613: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30583 1726853704.48833: iterating over new_blocks loaded from include file 30583 1726853704.48835: in VariableManager get_vars() 30583 1726853704.48847: done with get_vars() 30583 1726853704.48848: filtering new block on tags 30583 1726853704.49005: done filtering new block on tags 30583 1726853704.49008: in VariableManager get_vars() 30583 1726853704.49016: done with get_vars() 30583 1726853704.49017: filtering new block on tags 30583 1726853704.49027: done 
filtering new block on tags 30583 1726853704.49029: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30583 1726853704.49033: extending task lists for all hosts with included blocks 30583 1726853704.49099: done extending task lists 30583 1726853704.49100: done processing included files 30583 1726853704.49100: results queue empty 30583 1726853704.49101: checking for any_errors_fatal 30583 1726853704.49103: done checking for any_errors_fatal 30583 1726853704.49103: checking for max_fail_percentage 30583 1726853704.49104: done checking for max_fail_percentage 30583 1726853704.49105: checking to see if all hosts have failed and the running result is not ok 30583 1726853704.49105: done checking to see if all hosts have failed 30583 1726853704.49106: getting the remaining hosts for this loop 30583 1726853704.49106: done getting the remaining hosts for this loop 30583 1726853704.49108: getting the next task for host managed_node2 30583 1726853704.49111: done getting next task for host managed_node2 30583 1726853704.49113: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30583 1726853704.49115: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853704.49122: getting variables 30583 1726853704.49123: in VariableManager get_vars() 30583 1726853704.49131: Calling all_inventory to load vars for managed_node2 30583 1726853704.49132: Calling groups_inventory to load vars for managed_node2 30583 1726853704.49133: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853704.49137: Calling all_plugins_play to load vars for managed_node2 30583 1726853704.49138: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853704.49140: Calling groups_plugins_play to load vars for managed_node2 30583 1726853704.49758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853704.50678: done with get_vars() 30583 1726853704.50695: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:35:04 -0400 (0:00:00.066) 0:00:39.844 ****** 30583 1726853704.50744: entering _queue_task() for managed_node2/include_tasks 30583 1726853704.51010: worker is 1 (out of 1 available) 30583 1726853704.51024: exiting _queue_task() for managed_node2/include_tasks 30583 1726853704.51037: done queuing things up, now waiting for results queue to drain 30583 1726853704.51038: waiting for pending results... 
30583 1726853704.51227: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30583 1726853704.51321: in run() - task 02083763-bbaf-05ea-abc5-000000000d16 30583 1726853704.51332: variable 'ansible_search_path' from source: unknown 30583 1726853704.51335: variable 'ansible_search_path' from source: unknown 30583 1726853704.51368: calling self._execute() 30583 1726853704.51439: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853704.51442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853704.51451: variable 'omit' from source: magic vars 30583 1726853704.51727: variable 'ansible_distribution_major_version' from source: facts 30583 1726853704.51737: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853704.51742: _execute() done 30583 1726853704.51745: dumping result to json 30583 1726853704.51747: done dumping result, returning 30583 1726853704.51755: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-05ea-abc5-000000000d16] 30583 1726853704.51761: sending task result for task 02083763-bbaf-05ea-abc5-000000000d16 30583 1726853704.51846: done sending task result for task 02083763-bbaf-05ea-abc5-000000000d16 30583 1726853704.51848: WORKER PROCESS EXITING 30583 1726853704.51897: no more pending results, returning what we have 30583 1726853704.51903: in VariableManager get_vars() 30583 1726853704.51942: Calling all_inventory to load vars for managed_node2 30583 1726853704.51945: Calling groups_inventory to load vars for managed_node2 30583 1726853704.51948: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853704.51966: Calling all_plugins_play to load vars for managed_node2 30583 1726853704.51970: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853704.51974: Calling 
groups_plugins_play to load vars for managed_node2 30583 1726853704.52751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853704.53620: done with get_vars() 30583 1726853704.53636: variable 'ansible_search_path' from source: unknown 30583 1726853704.53637: variable 'ansible_search_path' from source: unknown 30583 1726853704.53665: we have included files to process 30583 1726853704.53666: generating all_blocks data 30583 1726853704.53667: done generating all_blocks data 30583 1726853704.53670: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853704.53673: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853704.53674: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853704.54052: done processing included file 30583 1726853704.54054: iterating over new_blocks loaded from include file 30583 1726853704.54055: in VariableManager get_vars() 30583 1726853704.54073: done with get_vars() 30583 1726853704.54075: filtering new block on tags 30583 1726853704.54094: done filtering new block on tags 30583 1726853704.54096: in VariableManager get_vars() 30583 1726853704.54109: done with get_vars() 30583 1726853704.54110: filtering new block on tags 30583 1726853704.54138: done filtering new block on tags 30583 1726853704.54140: in VariableManager get_vars() 30583 1726853704.54153: done with get_vars() 30583 1726853704.54154: filtering new block on tags 30583 1726853704.54180: done filtering new block on tags 30583 1726853704.54181: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30583 1726853704.54185: extending task lists for 
all hosts with included blocks 30583 1726853704.55192: done extending task lists 30583 1726853704.55193: done processing included files 30583 1726853704.55194: results queue empty 30583 1726853704.55194: checking for any_errors_fatal 30583 1726853704.55197: done checking for any_errors_fatal 30583 1726853704.55197: checking for max_fail_percentage 30583 1726853704.55198: done checking for max_fail_percentage 30583 1726853704.55199: checking to see if all hosts have failed and the running result is not ok 30583 1726853704.55199: done checking to see if all hosts have failed 30583 1726853704.55200: getting the remaining hosts for this loop 30583 1726853704.55201: done getting the remaining hosts for this loop 30583 1726853704.55202: getting the next task for host managed_node2 30583 1726853704.55206: done getting next task for host managed_node2 30583 1726853704.55209: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30583 1726853704.55212: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853704.55219: getting variables 30583 1726853704.55220: in VariableManager get_vars() 30583 1726853704.55229: Calling all_inventory to load vars for managed_node2 30583 1726853704.55231: Calling groups_inventory to load vars for managed_node2 30583 1726853704.55232: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853704.55236: Calling all_plugins_play to load vars for managed_node2 30583 1726853704.55237: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853704.55239: Calling groups_plugins_play to load vars for managed_node2 30583 1726853704.55908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853704.56765: done with get_vars() 30583 1726853704.56781: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:35:04 -0400 (0:00:00.060) 0:00:39.905 ****** 30583 1726853704.56836: entering _queue_task() for managed_node2/setup 30583 1726853704.57113: worker is 1 (out of 1 available) 30583 1726853704.57127: exiting _queue_task() for managed_node2/setup 30583 1726853704.57140: done queuing things up, now waiting for results queue to drain 30583 1726853704.57141: waiting for pending results... 
30583 1726853704.57328: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30583 1726853704.57430: in run() - task 02083763-bbaf-05ea-abc5-000000000d6d 30583 1726853704.57442: variable 'ansible_search_path' from source: unknown 30583 1726853704.57446: variable 'ansible_search_path' from source: unknown 30583 1726853704.57480: calling self._execute() 30583 1726853704.57547: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853704.57551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853704.57561: variable 'omit' from source: magic vars 30583 1726853704.57834: variable 'ansible_distribution_major_version' from source: facts 30583 1726853704.57844: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853704.57995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853704.59483: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853704.59566: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853704.59677: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853704.59695: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853704.59751: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853704.59895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853704.59925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853704.59943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853704.59970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853704.59983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853704.60026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853704.60039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853704.60058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853704.60088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853704.60100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853704.60208: variable '__network_required_facts' from source: role 
'' defaults 30583 1726853704.60215: variable 'ansible_facts' from source: unknown 30583 1726853704.60848: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30583 1726853704.60851: when evaluation is False, skipping this task 30583 1726853704.60854: _execute() done 30583 1726853704.60859: dumping result to json 30583 1726853704.60861: done dumping result, returning 30583 1726853704.60864: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-05ea-abc5-000000000d6d] 30583 1726853704.60865: sending task result for task 02083763-bbaf-05ea-abc5-000000000d6d 30583 1726853704.60935: done sending task result for task 02083763-bbaf-05ea-abc5-000000000d6d 30583 1726853704.60937: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853704.61043: no more pending results, returning what we have 30583 1726853704.61048: results queue empty 30583 1726853704.61049: checking for any_errors_fatal 30583 1726853704.61052: done checking for any_errors_fatal 30583 1726853704.61052: checking for max_fail_percentage 30583 1726853704.61054: done checking for max_fail_percentage 30583 1726853704.61056: checking to see if all hosts have failed and the running result is not ok 30583 1726853704.61057: done checking to see if all hosts have failed 30583 1726853704.61058: getting the remaining hosts for this loop 30583 1726853704.61060: done getting the remaining hosts for this loop 30583 1726853704.61064: getting the next task for host managed_node2 30583 1726853704.61078: done getting next task for host managed_node2 30583 1726853704.61083: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30583 1726853704.61090: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853704.61115: getting variables 30583 1726853704.61117: in VariableManager get_vars() 30583 1726853704.61158: Calling all_inventory to load vars for managed_node2 30583 1726853704.61160: Calling groups_inventory to load vars for managed_node2 30583 1726853704.61164: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853704.61409: Calling all_plugins_play to load vars for managed_node2 30583 1726853704.61420: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853704.61436: Calling groups_plugins_play to load vars for managed_node2 30583 1726853704.62487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853704.63643: done with get_vars() 30583 1726853704.63660: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:35:04 -0400 (0:00:00.068) 0:00:39.974 ****** 30583 1726853704.63733: entering _queue_task() for managed_node2/stat 30583 1726853704.64026: worker is 1 (out of 1 available) 30583 1726853704.64040: exiting _queue_task() for managed_node2/stat 30583 1726853704.64052: done queuing things up, now waiting for results queue to drain 30583 1726853704.64053: waiting for pending results... 
30583 1726853704.64601: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30583 1726853704.64606: in run() - task 02083763-bbaf-05ea-abc5-000000000d6f 30583 1726853704.64609: variable 'ansible_search_path' from source: unknown 30583 1726853704.64611: variable 'ansible_search_path' from source: unknown 30583 1726853704.64613: calling self._execute() 30583 1726853704.64677: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853704.64691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853704.64708: variable 'omit' from source: magic vars 30583 1726853704.65084: variable 'ansible_distribution_major_version' from source: facts 30583 1726853704.65104: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853704.65275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853704.65567: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853704.65675: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853704.65678: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853704.65681: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853704.65765: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853704.65802: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853704.65834: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853704.65869: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853704.65962: variable '__network_is_ostree' from source: set_fact 30583 1726853704.65977: Evaluated conditional (not __network_is_ostree is defined): False 30583 1726853704.65984: when evaluation is False, skipping this task 30583 1726853704.65992: _execute() done 30583 1726853704.66003: dumping result to json 30583 1726853704.66011: done dumping result, returning 30583 1726853704.66022: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-05ea-abc5-000000000d6f] 30583 1726853704.66075: sending task result for task 02083763-bbaf-05ea-abc5-000000000d6f skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30583 1726853704.66322: no more pending results, returning what we have 30583 1726853704.66330: results queue empty 30583 1726853704.66332: checking for any_errors_fatal 30583 1726853704.66340: done checking for any_errors_fatal 30583 1726853704.66340: checking for max_fail_percentage 30583 1726853704.66342: done checking for max_fail_percentage 30583 1726853704.66343: checking to see if all hosts have failed and the running result is not ok 30583 1726853704.66344: done checking to see if all hosts have failed 30583 1726853704.66344: getting the remaining hosts for this loop 30583 1726853704.66346: done getting the remaining hosts for this loop 30583 1726853704.66350: getting the next task for host managed_node2 30583 1726853704.66360: done getting next task for host managed_node2 30583 
1726853704.66364: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30583 1726853704.66369: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853704.66379: done sending task result for task 02083763-bbaf-05ea-abc5-000000000d6f 30583 1726853704.66382: WORKER PROCESS EXITING 30583 1726853704.66393: getting variables 30583 1726853704.66395: in VariableManager get_vars() 30583 1726853704.66427: Calling all_inventory to load vars for managed_node2 30583 1726853704.66430: Calling groups_inventory to load vars for managed_node2 30583 1726853704.66431: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853704.66440: Calling all_plugins_play to load vars for managed_node2 30583 1726853704.66443: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853704.66445: Calling groups_plugins_play to load vars for managed_node2 30583 1726853704.67346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853704.68366: done with get_vars() 30583 1726853704.68389: done getting variables 30583 1726853704.68444: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:35:04 -0400 (0:00:00.047) 0:00:40.022 ****** 30583 1726853704.68486: entering _queue_task() for managed_node2/set_fact 30583 1726853704.68820: worker is 1 (out of 1 available) 30583 1726853704.68832: exiting _queue_task() for managed_node2/set_fact 30583 1726853704.68844: done queuing things up, now waiting for results queue to drain 30583 1726853704.68845: waiting for pending results... 
30583 1726853704.69187: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30583 1726853704.69337: in run() - task 02083763-bbaf-05ea-abc5-000000000d70 30583 1726853704.69361: variable 'ansible_search_path' from source: unknown 30583 1726853704.69370: variable 'ansible_search_path' from source: unknown 30583 1726853704.69411: calling self._execute() 30583 1726853704.69514: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853704.69527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853704.69547: variable 'omit' from source: magic vars 30583 1726853704.69978: variable 'ansible_distribution_major_version' from source: facts 30583 1726853704.69982: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853704.70136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853704.70425: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853704.70477: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853704.70515: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853704.70552: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853704.70629: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853704.70650: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853704.70679: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853704.70693: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853704.70763: variable '__network_is_ostree' from source: set_fact 30583 1726853704.70769: Evaluated conditional (not __network_is_ostree is defined): False 30583 1726853704.70774: when evaluation is False, skipping this task 30583 1726853704.70776: _execute() done 30583 1726853704.70785: dumping result to json 30583 1726853704.70788: done dumping result, returning 30583 1726853704.70791: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-05ea-abc5-000000000d70] 30583 1726853704.70794: sending task result for task 02083763-bbaf-05ea-abc5-000000000d70 30583 1726853704.70880: done sending task result for task 02083763-bbaf-05ea-abc5-000000000d70 30583 1726853704.70883: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30583 1726853704.70933: no more pending results, returning what we have 30583 1726853704.70937: results queue empty 30583 1726853704.70938: checking for any_errors_fatal 30583 1726853704.70945: done checking for any_errors_fatal 30583 1726853704.70946: checking for max_fail_percentage 30583 1726853704.70948: done checking for max_fail_percentage 30583 1726853704.70949: checking to see if all hosts have failed and the running result is not ok 30583 1726853704.70950: done checking to see if all hosts have failed 30583 1726853704.70951: getting the remaining hosts for this loop 30583 1726853704.70953: done getting the remaining hosts for this loop 
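[editor's note] Both ostree tasks above are skipped for the same reason: a prior set_fact already defined `__network_is_ostree`, so the guard `not __network_is_ostree is defined` evaluates False and Ansible emits the `skipping:` result without running the module. An illustrative sketch of that short-circuit, NOT Ansible's actual implementation (function and variable names here are hypothetical):

```python
# Hedged sketch of how a `when:` guard like
# `not __network_is_ostree is defined` yields the "skipping" result
# above once the variable already exists in task vars.
def evaluate_when(task_vars, var_name):
    # Jinja2's `is defined` test reduces to a membership check here.
    condition_holds = var_name not in task_vars
    if not condition_holds:
        return {
            "changed": False,
            "false_condition": f"not {var_name} is defined",
            "skip_reason": "Conditional result was False",
        }
    return None  # task would proceed to execute

result = evaluate_when({"__network_is_ostree": False}, "__network_is_ostree")
print(result["skip_reason"])  # Conditional result was False
```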
30583 1726853704.70957: getting the next task for host managed_node2 30583 1726853704.70968: done getting next task for host managed_node2 30583 1726853704.70973: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853704.70979: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853704.71006: getting variables 30583 1726853704.71008: in VariableManager get_vars() 30583 1726853704.71043: Calling all_inventory to load vars for managed_node2 30583 1726853704.71046: Calling groups_inventory to load vars for managed_node2 30583 1726853704.71047: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853704.71056: Calling all_plugins_play to load vars for managed_node2 30583 1726853704.71059: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853704.71061: Calling groups_plugins_play to load vars for managed_node2 30583 1726853704.71844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853704.73224: done with get_vars() 30583 1726853704.73244: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:35:04 -0400 (0:00:00.048) 0:00:40.070 ****** 30583 1726853704.73351: entering _queue_task() for managed_node2/service_facts 30583 1726853704.73633: worker is 1 (out of 1 available) 30583 1726853704.73647: exiting _queue_task() for managed_node2/service_facts 30583 1726853704.73662: done queuing things up, now waiting for results queue to drain 30583 1726853704.73664: waiting for pending results... 
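[editor's note] The `_low_level_execute_command()` calls that follow all wrap their payload as `/bin/sh -c '<cmd> && sleep 0'`: first probe the remote home directory (`echo ~`), then create a private temp directory under `umask 77`, then transfer, chmod, and execute the AnsiballZ module with the remote Python. A local sketch of the first two steps, under the assumption that the same wrapper pattern is reproduced verbatim (the real directory names embed a timestamp and random suffix, elided here):

```python
# Hedged sketch of the low-level command pattern visible in the log;
# not Ansible source. Each step runs as: /bin/sh -c '<cmd> && sleep 0'
import os
import subprocess
import tempfile

def low_level_execute(cmd):
    proc = subprocess.run(["/bin/sh", "-c", cmd + " && sleep 0"],
                          capture_output=True, text=True, check=True)
    return proc.stdout.strip()

# Step 1: discover the remote user's home directory.
home = low_level_execute("echo ~")

# Step 2: create a private (mode 0700 when freshly created) temp dir
# for the module payload and echo its path back, as the log shows.
demo = os.path.join(tempfile.gettempdir(), "ansible-tmp-demo")
tmpdir = low_level_execute(
    '( umask 77 && mkdir -p "{0}" && echo "{0}" )'.format(demo)
)
print(tmpdir)
```

The `&& sleep 0` suffix is part of Ansible's wrapper for every low-level command, as seen throughout the log.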
30583 1726853704.73858: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853704.73963: in run() - task 02083763-bbaf-05ea-abc5-000000000d72 30583 1726853704.73981: variable 'ansible_search_path' from source: unknown 30583 1726853704.74060: variable 'ansible_search_path' from source: unknown 30583 1726853704.74064: calling self._execute() 30583 1726853704.74176: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853704.74180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853704.74183: variable 'omit' from source: magic vars 30583 1726853704.74520: variable 'ansible_distribution_major_version' from source: facts 30583 1726853704.74536: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853704.74546: variable 'omit' from source: magic vars 30583 1726853704.74625: variable 'omit' from source: magic vars 30583 1726853704.74663: variable 'omit' from source: magic vars 30583 1726853704.74710: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853704.74751: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853704.74910: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853704.74923: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853704.74926: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853704.74929: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853704.74931: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853704.74933: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853704.74951: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853704.74956: Set connection var ansible_timeout to 10 30583 1726853704.74962: Set connection var ansible_connection to ssh 30583 1726853704.74967: Set connection var ansible_shell_executable to /bin/sh 30583 1726853704.74970: Set connection var ansible_shell_type to sh 30583 1726853704.74980: Set connection var ansible_pipelining to False 30583 1726853704.74998: variable 'ansible_shell_executable' from source: unknown 30583 1726853704.75002: variable 'ansible_connection' from source: unknown 30583 1726853704.75005: variable 'ansible_module_compression' from source: unknown 30583 1726853704.75007: variable 'ansible_shell_type' from source: unknown 30583 1726853704.75010: variable 'ansible_shell_executable' from source: unknown 30583 1726853704.75012: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853704.75014: variable 'ansible_pipelining' from source: unknown 30583 1726853704.75016: variable 'ansible_timeout' from source: unknown 30583 1726853704.75029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853704.75186: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853704.75194: variable 'omit' from source: magic vars 30583 1726853704.75199: starting attempt loop 30583 1726853704.75202: running the handler 30583 1726853704.75213: _low_level_execute_command(): starting 30583 1726853704.75220: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853704.75874: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853704.75880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853704.75975: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853704.76043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853704.77787: stdout chunk (state=3): >>>/root <<< 30583 1726853704.77878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853704.77910: stderr chunk (state=3): >>><<< 30583 1726853704.77912: stdout chunk (state=3): >>><<< 30583 1726853704.77925: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853704.77965: _low_level_execute_command(): starting 30583 1726853704.77969: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853704.7792952-32461-102243790274361 `" && echo ansible-tmp-1726853704.7792952-32461-102243790274361="` echo /root/.ansible/tmp/ansible-tmp-1726853704.7792952-32461-102243790274361 `" ) && sleep 0' 30583 1726853704.78437: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853704.78440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853704.78444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853704.78454: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853704.78532: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853704.78591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853704.80604: stdout chunk (state=3): >>>ansible-tmp-1726853704.7792952-32461-102243790274361=/root/.ansible/tmp/ansible-tmp-1726853704.7792952-32461-102243790274361 <<< 30583 1726853704.80713: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853704.80736: stderr chunk (state=3): >>><<< 30583 1726853704.80739: stdout chunk (state=3): >>><<< 30583 1726853704.80752: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853704.7792952-32461-102243790274361=/root/.ansible/tmp/ansible-tmp-1726853704.7792952-32461-102243790274361 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853704.80793: variable 'ansible_module_compression' from source: unknown 30583 1726853704.80832: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30583 1726853704.80863: variable 'ansible_facts' from source: unknown 30583 1726853704.80921: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853704.7792952-32461-102243790274361/AnsiballZ_service_facts.py 30583 1726853704.81015: Sending initial data 30583 1726853704.81018: Sent initial data (162 bytes) 30583 1726853704.81438: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853704.81441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853704.81443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853704.81446: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853704.81447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853704.81498: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853704.81502: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853704.81579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853704.83217: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30583 1726853704.83228: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853704.83285: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853704.83352: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpitryuv3m /root/.ansible/tmp/ansible-tmp-1726853704.7792952-32461-102243790274361/AnsiballZ_service_facts.py <<< 30583 1726853704.83361: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853704.7792952-32461-102243790274361/AnsiballZ_service_facts.py" <<< 30583 1726853704.83419: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpitryuv3m" to remote "/root/.ansible/tmp/ansible-tmp-1726853704.7792952-32461-102243790274361/AnsiballZ_service_facts.py" <<< 30583 1726853704.83424: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853704.7792952-32461-102243790274361/AnsiballZ_service_facts.py" <<< 30583 1726853704.84151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853704.84276: stderr chunk (state=3): >>><<< 30583 1726853704.84279: stdout chunk (state=3): >>><<< 30583 1726853704.84283: done transferring module to remote 30583 1726853704.84285: _low_level_execute_command(): starting 30583 1726853704.84287: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853704.7792952-32461-102243790274361/ /root/.ansible/tmp/ansible-tmp-1726853704.7792952-32461-102243790274361/AnsiballZ_service_facts.py && sleep 0' 30583 1726853704.84850: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853704.84859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853704.84863: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853704.84883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853704.85009: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853704.85070: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853704.86961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853704.86964: stdout chunk (state=3): >>><<< 30583 1726853704.86985: stderr chunk (state=3): >>><<< 30583 1726853704.86992: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853704.87013: _low_level_execute_command(): starting 30583 1726853704.87018: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853704.7792952-32461-102243790274361/AnsiballZ_service_facts.py && sleep 0' 30583 1726853704.87654: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853704.87681: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853704.87695: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853704.87734: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 
1726853704.87748: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853704.87830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853706.52318: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": 
"dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 30583 1726853706.52412: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": 
"systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": 
"capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 30583 1726853706.52525: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30583 1726853706.54082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853706.54277: stderr chunk (state=3): >>>Shared connection to 10.31.9.197 closed. <<< 30583 1726853706.54281: stdout chunk (state=3): >>><<< 30583 1726853706.54283: stderr chunk (state=3): >>><<< 30583 1726853706.54289: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": 
"gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", 
"state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": 
{"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": 
"systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": 
"systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.9.197 closed. 30583 1726853706.55329: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853704.7792952-32461-102243790274361/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853706.55400: _low_level_execute_command(): starting 30583 1726853706.55411: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853704.7792952-32461-102243790274361/ > /dev/null 2>&1 && sleep 0' 30583 1726853706.56492: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853706.56584: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853706.56646: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853706.58640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853706.58704: stderr chunk (state=3): >>><<< 30583 1726853706.58741: stdout chunk (state=3): >>><<< 30583 1726853706.58980: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853706.58984: handler run complete 30583 1726853706.58995: variable 'ansible_facts' from source: unknown 30583 1726853706.59160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 30583 1726853706.59705: variable 'ansible_facts' from source: unknown 30583 1726853706.59933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853706.60265: attempt loop complete, returning result 30583 1726853706.60278: _execute() done 30583 1726853706.60286: dumping result to json 30583 1726853706.60381: done dumping result, returning 30583 1726853706.60401: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-05ea-abc5-000000000d72] 30583 1726853706.60415: sending task result for task 02083763-bbaf-05ea-abc5-000000000d72 30583 1726853706.61926: done sending task result for task 02083763-bbaf-05ea-abc5-000000000d72 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853706.62004: no more pending results, returning what we have 30583 1726853706.62007: results queue empty 30583 1726853706.62008: checking for any_errors_fatal 30583 1726853706.62012: done checking for any_errors_fatal 30583 1726853706.62013: checking for max_fail_percentage 30583 1726853706.62015: done checking for max_fail_percentage 30583 1726853706.62016: checking to see if all hosts have failed and the running result is not ok 30583 1726853706.62017: done checking to see if all hosts have failed 30583 1726853706.62017: getting the remaining hosts for this loop 30583 1726853706.62019: done getting the remaining hosts for this loop 30583 1726853706.62022: getting the next task for host managed_node2 30583 1726853706.62074: done getting next task for host managed_node2 30583 1726853706.62078: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 1726853706.62083: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853706.62092: WORKER PROCESS EXITING 30583 1726853706.62101: getting variables 30583 1726853706.62142: in VariableManager get_vars() 30583 1726853706.62223: Calling all_inventory to load vars for managed_node2 30583 1726853706.62226: Calling groups_inventory to load vars for managed_node2 30583 1726853706.62229: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853706.62239: Calling all_plugins_play to load vars for managed_node2 30583 1726853706.62242: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853706.62245: Calling groups_plugins_play to load vars for managed_node2 30583 1726853706.64551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853706.66251: done with get_vars() 30583 1726853706.66281: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:35:06 -0400 (0:00:01.930) 0:00:42.001 ****** 30583 1726853706.66393: entering _queue_task() for managed_node2/package_facts 30583 1726853706.66900: worker is 1 (out of 1 available) 30583 1726853706.66911: exiting _queue_task() for managed_node2/package_facts 30583 1726853706.66923: done queuing things up, now waiting for results queue to drain 30583 1726853706.66924: waiting for pending results... 
30583 1726853706.67098: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 1726853706.67373: in run() - task 02083763-bbaf-05ea-abc5-000000000d73 30583 1726853706.67377: variable 'ansible_search_path' from source: unknown 30583 1726853706.67380: variable 'ansible_search_path' from source: unknown 30583 1726853706.67383: calling self._execute() 30583 1726853706.67440: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853706.67452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853706.67465: variable 'omit' from source: magic vars 30583 1726853706.67852: variable 'ansible_distribution_major_version' from source: facts 30583 1726853706.67873: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853706.67886: variable 'omit' from source: magic vars 30583 1726853706.67969: variable 'omit' from source: magic vars 30583 1726853706.68009: variable 'omit' from source: magic vars 30583 1726853706.68060: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853706.68103: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853706.68134: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853706.68158: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853706.68240: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853706.68243: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853706.68246: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853706.68248: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853706.68333: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853706.68353: Set connection var ansible_timeout to 10 30583 1726853706.68362: Set connection var ansible_connection to ssh 30583 1726853706.68375: Set connection var ansible_shell_executable to /bin/sh 30583 1726853706.68383: Set connection var ansible_shell_type to sh 30583 1726853706.68399: Set connection var ansible_pipelining to False 30583 1726853706.68428: variable 'ansible_shell_executable' from source: unknown 30583 1726853706.68438: variable 'ansible_connection' from source: unknown 30583 1726853706.68446: variable 'ansible_module_compression' from source: unknown 30583 1726853706.68476: variable 'ansible_shell_type' from source: unknown 30583 1726853706.68479: variable 'ansible_shell_executable' from source: unknown 30583 1726853706.68481: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853706.68483: variable 'ansible_pipelining' from source: unknown 30583 1726853706.68486: variable 'ansible_timeout' from source: unknown 30583 1726853706.68488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853706.68877: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853706.68881: variable 'omit' from source: magic vars 30583 1726853706.68884: starting attempt loop 30583 1726853706.68887: running the handler 30583 1726853706.68890: _low_level_execute_command(): starting 30583 1726853706.68893: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853706.69575: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853706.69581: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853706.69583: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853706.69603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853706.69708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853706.71447: stdout chunk (state=3): >>>/root <<< 30583 1726853706.71610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853706.71614: stdout chunk (state=3): >>><<< 30583 1726853706.71617: stderr chunk (state=3): >>><<< 30583 1726853706.71638: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853706.71741: _low_level_execute_command(): starting 30583 1726853706.71746: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853706.7164447-32544-131929707602192 `" && echo ansible-tmp-1726853706.7164447-32544-131929707602192="` echo /root/.ansible/tmp/ansible-tmp-1726853706.7164447-32544-131929707602192 `" ) && sleep 0' 30583 1726853706.72387: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853706.72416: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853706.72434: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853706.72539: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853706.74607: stdout chunk (state=3): >>>ansible-tmp-1726853706.7164447-32544-131929707602192=/root/.ansible/tmp/ansible-tmp-1726853706.7164447-32544-131929707602192 <<< 30583 1726853706.74753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853706.74778: stdout chunk (state=3): >>><<< 30583 1726853706.74977: stderr chunk (state=3): >>><<< 30583 1726853706.74980: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853706.7164447-32544-131929707602192=/root/.ansible/tmp/ansible-tmp-1726853706.7164447-32544-131929707602192 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853706.74983: variable 'ansible_module_compression' from source: unknown 30583 1726853706.74985: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30583 1726853706.74987: variable 'ansible_facts' from source: unknown 30583 1726853706.75183: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853706.7164447-32544-131929707602192/AnsiballZ_package_facts.py 30583 1726853706.75339: Sending initial data 30583 1726853706.75348: Sent initial data (162 bytes) 30583 1726853706.75990: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853706.76002: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853706.76092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853706.76125: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853706.76142: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853706.76161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853706.76261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853706.77926: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853706.78016: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853706.78083: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpws3m65hl /root/.ansible/tmp/ansible-tmp-1726853706.7164447-32544-131929707602192/AnsiballZ_package_facts.py <<< 30583 1726853706.78105: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853706.7164447-32544-131929707602192/AnsiballZ_package_facts.py" <<< 30583 1726853706.78177: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpws3m65hl" to remote "/root/.ansible/tmp/ansible-tmp-1726853706.7164447-32544-131929707602192/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853706.7164447-32544-131929707602192/AnsiballZ_package_facts.py" <<< 30583 1726853706.79737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853706.79741: stderr chunk (state=3): >>><<< 30583 1726853706.79743: stdout chunk (state=3): >>><<< 30583 1726853706.79765: done transferring module to remote 30583 1726853706.79778: _low_level_execute_command(): starting 30583 1726853706.79787: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853706.7164447-32544-131929707602192/ /root/.ansible/tmp/ansible-tmp-1726853706.7164447-32544-131929707602192/AnsiballZ_package_facts.py && sleep 0' 30583 1726853706.80404: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853706.80407: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853706.80410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853706.80478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853706.80481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 <<< 30583 1726853706.80483: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853706.80486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853706.80488: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853706.80490: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853706.80492: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853706.80494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853706.80496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853706.80512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853706.80517: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853706.80528: stderr chunk (state=3): >>>debug2: match found <<< 30583 1726853706.80533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853706.80604: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853706.80645: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853706.80714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853706.82778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853706.82782: stderr chunk (state=3): >>><<< 30583 1726853706.82784: stdout chunk (state=3): >>><<< 30583 1726853706.82786: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853706.82788: _low_level_execute_command(): starting 30583 1726853706.82791: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853706.7164447-32544-131929707602192/AnsiballZ_package_facts.py && sleep 0' 30583 1726853706.83420: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853706.83427: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853706.83439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853706.83452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853706.83495: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853706.83572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853706.83590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853706.83711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853707.28923: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": 
[{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": 
"rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch":
"x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": 
"libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": 
"rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": 
"grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": 
"dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": 
"x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", 
"version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 
0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": 
"perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": 
"python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], 
"libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30583 1726853707.30731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853707.30735: stdout chunk (state=3): >>><<< 30583 1726853707.30742: stderr chunk (state=3): >>><<< 30583 1726853707.30978: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853707.33454: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853706.7164447-32544-131929707602192/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853707.33489: _low_level_execute_command(): starting 30583 1726853707.33499: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853706.7164447-32544-131929707602192/ > /dev/null 2>&1 && sleep 0' 30583 1726853707.34139: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853707.34194: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853707.34213: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853707.34239: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853707.34426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853707.36379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853707.36383: stderr chunk (state=3): >>><<< 30583 1726853707.36478: stdout chunk (state=3): >>><<< 30583 1726853707.36482: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853707.36484: handler run complete 30583 1726853707.37327: variable 'ansible_facts' from source: unknown 30583 1726853707.37618: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853707.38987: variable 'ansible_facts' from source: unknown 30583 1726853707.39727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853707.40501: attempt loop complete, returning result 30583 1726853707.40515: _execute() done 30583 1726853707.40518: dumping result to json 30583 1726853707.40735: done dumping result, returning 30583 1726853707.40750: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-05ea-abc5-000000000d73] 30583 1726853707.40760: sending task result for task 02083763-bbaf-05ea-abc5-000000000d73 30583 1726853707.43147: done sending task result for task 02083763-bbaf-05ea-abc5-000000000d73 30583 1726853707.43152: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853707.43311: no more pending results, returning what we have 30583 1726853707.43314: results queue empty 30583 1726853707.43314: checking for any_errors_fatal 30583 1726853707.43321: done checking for any_errors_fatal 30583 1726853707.43322: checking for max_fail_percentage 30583 1726853707.43324: done checking for max_fail_percentage 30583 1726853707.43325: checking to see if all hosts have failed and the running result is not ok 30583 1726853707.43325: done checking to see if all hosts have failed 30583 1726853707.43326: getting the remaining hosts for this loop 30583 1726853707.43327: done getting the remaining hosts for this loop 30583 1726853707.43330: getting the next task for host managed_node2 30583 1726853707.43338: done getting next task for host managed_node2 30583 1726853707.43342: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853707.43347: 
^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853707.43359: getting variables 30583 1726853707.43361: in VariableManager get_vars() 30583 1726853707.43392: Calling all_inventory to load vars for managed_node2 30583 1726853707.43395: Calling groups_inventory to load vars for managed_node2 30583 1726853707.43397: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853707.43406: Calling all_plugins_play to load vars for managed_node2 30583 1726853707.43409: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853707.43412: Calling groups_plugins_play to load vars for managed_node2 30583 1726853707.44954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853707.45885: done with get_vars() 30583 1726853707.45902: done getting variables 30583 1726853707.45945: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:35:07 -0400 (0:00:00.795) 0:00:42.796 ****** 30583 1726853707.45977: entering _queue_task() for managed_node2/debug 30583 1726853707.46500: worker is 1 (out of 1 available) 30583 1726853707.46510: exiting _queue_task() for managed_node2/debug 30583 1726853707.46520: done queuing things up, now waiting for results queue to drain 30583 1726853707.46521: waiting for pending results... 
30583 1726853707.46651: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853707.46859: in run() - task 02083763-bbaf-05ea-abc5-000000000d17 30583 1726853707.46863: variable 'ansible_search_path' from source: unknown 30583 1726853707.46866: variable 'ansible_search_path' from source: unknown 30583 1726853707.46872: calling self._execute() 30583 1726853707.46962: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853707.46980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853707.46996: variable 'omit' from source: magic vars 30583 1726853707.47419: variable 'ansible_distribution_major_version' from source: facts 30583 1726853707.47437: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853707.47575: variable 'omit' from source: magic vars 30583 1726853707.47578: variable 'omit' from source: magic vars 30583 1726853707.47673: variable 'network_provider' from source: set_fact 30583 1726853707.47677: variable 'omit' from source: magic vars 30583 1726853707.47705: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853707.47763: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853707.47767: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853707.47943: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853707.47946: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853707.47949: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853707.47951: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 
1726853707.47954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853707.47985: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853707.48009: Set connection var ansible_timeout to 10 30583 1726853707.48013: Set connection var ansible_connection to ssh 30583 1726853707.48015: Set connection var ansible_shell_executable to /bin/sh 30583 1726853707.48018: Set connection var ansible_shell_type to sh 30583 1726853707.48020: Set connection var ansible_pipelining to False 30583 1726853707.48022: variable 'ansible_shell_executable' from source: unknown 30583 1726853707.48025: variable 'ansible_connection' from source: unknown 30583 1726853707.48028: variable 'ansible_module_compression' from source: unknown 30583 1726853707.48031: variable 'ansible_shell_type' from source: unknown 30583 1726853707.48036: variable 'ansible_shell_executable' from source: unknown 30583 1726853707.48075: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853707.48077: variable 'ansible_pipelining' from source: unknown 30583 1726853707.48079: variable 'ansible_timeout' from source: unknown 30583 1726853707.48082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853707.48151: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853707.48157: variable 'omit' from source: magic vars 30583 1726853707.48162: starting attempt loop 30583 1726853707.48166: running the handler 30583 1726853707.48202: handler run complete 30583 1726853707.48214: attempt loop complete, returning result 30583 1726853707.48217: _execute() done 30583 1726853707.48220: dumping result to json 30583 1726853707.48223: done dumping result, returning 
30583 1726853707.48229: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-05ea-abc5-000000000d17] 30583 1726853707.48232: sending task result for task 02083763-bbaf-05ea-abc5-000000000d17 30583 1726853707.48475: done sending task result for task 02083763-bbaf-05ea-abc5-000000000d17 30583 1726853707.48528: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 30583 1726853707.48620: no more pending results, returning what we have 30583 1726853707.48624: results queue empty 30583 1726853707.48625: checking for any_errors_fatal 30583 1726853707.48632: done checking for any_errors_fatal 30583 1726853707.48633: checking for max_fail_percentage 30583 1726853707.48635: done checking for max_fail_percentage 30583 1726853707.48637: checking to see if all hosts have failed and the running result is not ok 30583 1726853707.48638: done checking to see if all hosts have failed 30583 1726853707.48639: getting the remaining hosts for this loop 30583 1726853707.48641: done getting the remaining hosts for this loop 30583 1726853707.48644: getting the next task for host managed_node2 30583 1726853707.48653: done getting next task for host managed_node2 30583 1726853707.48659: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853707.48664: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853707.48679: getting variables 30583 1726853707.48681: in VariableManager get_vars() 30583 1726853707.48717: Calling all_inventory to load vars for managed_node2 30583 1726853707.48720: Calling groups_inventory to load vars for managed_node2 30583 1726853707.48723: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853707.48730: Calling all_plugins_play to load vars for managed_node2 30583 1726853707.48732: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853707.48734: Calling groups_plugins_play to load vars for managed_node2 30583 1726853707.50061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853707.51555: done with get_vars() 30583 1726853707.51583: done getting variables 30583 1726853707.51651: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:35:07 -0400 (0:00:00.057) 0:00:42.854 ****** 30583 1726853707.51697: entering _queue_task() for managed_node2/fail 30583 1726853707.52087: worker is 1 (out of 1 available) 30583 1726853707.52100: exiting _queue_task() for managed_node2/fail 30583 1726853707.52112: done queuing things up, now waiting for results queue to drain 30583 1726853707.52114: waiting for pending results... 30583 1726853707.52446: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853707.52582: in run() - task 02083763-bbaf-05ea-abc5-000000000d18 30583 1726853707.52598: variable 'ansible_search_path' from source: unknown 30583 1726853707.52602: variable 'ansible_search_path' from source: unknown 30583 1726853707.52628: calling self._execute() 30583 1726853707.52716: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853707.52719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853707.52729: variable 'omit' from source: magic vars 30583 1726853707.53142: variable 'ansible_distribution_major_version' from source: facts 30583 1726853707.53145: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853707.53259: variable 'network_state' from source: role '' defaults 30583 1726853707.53263: Evaluated conditional (network_state != {}): False 30583 1726853707.53266: when evaluation is False, skipping this task 30583 1726853707.53275: _execute() done 30583 1726853707.53278: dumping result to json 30583 1726853707.53281: done dumping result, returning 30583 1726853707.53284: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-05ea-abc5-000000000d18] 30583 1726853707.53288: sending task result for task 02083763-bbaf-05ea-abc5-000000000d18 30583 1726853707.53397: done sending task result for task 02083763-bbaf-05ea-abc5-000000000d18 30583 1726853707.53400: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853707.53472: no more pending results, returning what we have 30583 1726853707.53476: results queue empty 30583 1726853707.53477: checking for any_errors_fatal 30583 1726853707.53482: done checking for any_errors_fatal 30583 1726853707.53482: checking for max_fail_percentage 30583 1726853707.53484: done checking for max_fail_percentage 30583 1726853707.53485: checking to see if all hosts have failed and the running result is not ok 30583 1726853707.53490: done checking to see if all hosts have failed 30583 1726853707.53491: getting the remaining hosts for this loop 30583 1726853707.53493: done getting the remaining hosts for this loop 30583 1726853707.53496: getting the next task for host managed_node2 30583 1726853707.53504: done getting next task for host managed_node2 30583 1726853707.53507: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30583 1726853707.53512: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853707.53534: getting variables 30583 1726853707.53536: in VariableManager get_vars() 30583 1726853707.53568: Calling all_inventory to load vars for managed_node2 30583 1726853707.53570: Calling groups_inventory to load vars for managed_node2 30583 1726853707.53606: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853707.53622: Calling all_plugins_play to load vars for managed_node2 30583 1726853707.53626: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853707.53630: Calling groups_plugins_play to load vars for managed_node2 30583 1726853707.54655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853707.60491: done with get_vars() 30583 1726853707.60510: done getting variables 30583 1726853707.60543: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:35:07 -0400 (0:00:00.088) 0:00:42.942 ****** 30583 1726853707.60567: entering _queue_task() for managed_node2/fail 30583 1726853707.60902: worker is 1 (out of 1 available) 30583 1726853707.60916: exiting _queue_task() for managed_node2/fail 30583 1726853707.60928: done queuing things up, now waiting for results queue to drain 30583 1726853707.60930: waiting for pending results... 30583 1726853707.61193: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30583 1726853707.61328: in run() - task 02083763-bbaf-05ea-abc5-000000000d19 30583 1726853707.61368: variable 'ansible_search_path' from source: unknown 30583 1726853707.61376: variable 'ansible_search_path' from source: unknown 30583 1726853707.61410: calling self._execute() 30583 1726853707.61549: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853707.61554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853707.61604: variable 'omit' from source: magic vars 30583 1726853707.62041: variable 'ansible_distribution_major_version' from source: facts 30583 1726853707.62060: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853707.62177: variable 'network_state' from source: role '' defaults 30583 1726853707.62183: Evaluated conditional (network_state != {}): False 30583 1726853707.62189: when evaluation is False, skipping this task 30583 1726853707.62192: _execute() done 30583 1726853707.62196: dumping result to json 30583 1726853707.62199: done dumping result, returning 30583 1726853707.62209: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [02083763-bbaf-05ea-abc5-000000000d19] 30583 1726853707.62213: sending task result for task 02083763-bbaf-05ea-abc5-000000000d19 30583 1726853707.62311: done sending task result for task 02083763-bbaf-05ea-abc5-000000000d19 30583 1726853707.62314: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853707.62375: no more pending results, returning what we have 30583 1726853707.62379: results queue empty 30583 1726853707.62380: checking for any_errors_fatal 30583 1726853707.62392: done checking for any_errors_fatal 30583 1726853707.62393: checking for max_fail_percentage 30583 1726853707.62395: done checking for max_fail_percentage 30583 1726853707.62396: checking to see if all hosts have failed and the running result is not ok 30583 1726853707.62397: done checking to see if all hosts have failed 30583 1726853707.62398: getting the remaining hosts for this loop 30583 1726853707.62399: done getting the remaining hosts for this loop 30583 1726853707.62403: getting the next task for host managed_node2 30583 1726853707.62411: done getting next task for host managed_node2 30583 1726853707.62415: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30583 1726853707.62421: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853707.62447: getting variables 30583 1726853707.62448: in VariableManager get_vars() 30583 1726853707.62491: Calling all_inventory to load vars for managed_node2 30583 1726853707.62497: Calling groups_inventory to load vars for managed_node2 30583 1726853707.62514: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853707.62525: Calling all_plugins_play to load vars for managed_node2 30583 1726853707.62528: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853707.62531: Calling groups_plugins_play to load vars for managed_node2 30583 1726853707.63724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853707.65338: done with get_vars() 30583 1726853707.65360: done getting variables 30583 1726853707.65431: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:35:07 -0400 (0:00:00.048) 0:00:42.991 ****** 30583 1726853707.65474: entering _queue_task() for managed_node2/fail 30583 1726853707.66093: worker is 1 (out of 1 available) 30583 1726853707.66103: exiting _queue_task() for managed_node2/fail 30583 1726853707.66113: done queuing things up, now waiting for results queue to drain 30583 1726853707.66114: waiting for pending results... 30583 1726853707.66502: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30583 1726853707.67077: in run() - task 02083763-bbaf-05ea-abc5-000000000d1a 30583 1726853707.67081: variable 'ansible_search_path' from source: unknown 30583 1726853707.67090: variable 'ansible_search_path' from source: unknown 30583 1726853707.67093: calling self._execute() 30583 1726853707.67249: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853707.67268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853707.67289: variable 'omit' from source: magic vars 30583 1726853707.67764: variable 'ansible_distribution_major_version' from source: facts 30583 1726853707.67787: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853707.67953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853707.69877: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853707.70234: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853707.70289: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853707.70328: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853707.70363: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853707.70460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853707.70494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853707.70525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853707.70574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853707.70598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853707.70775: variable 'ansible_distribution_major_version' from source: facts 30583 1726853707.70778: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30583 1726853707.70837: variable 'ansible_distribution' from source: facts 30583 1726853707.70847: variable '__network_rh_distros' from source: role '' defaults 30583 1726853707.70864: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30583 1726853707.71122: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853707.71160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853707.71199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853707.71251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853707.71286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853707.71346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853707.71387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853707.71431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853707.71576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 
1726853707.71579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853707.71582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853707.71584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853707.71606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853707.71646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853707.71676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853707.71993: variable 'network_connections' from source: include params 30583 1726853707.72007: variable 'interface' from source: play vars 30583 1726853707.72083: variable 'interface' from source: play vars 30583 1726853707.72106: variable 'network_state' from source: role '' defaults 30583 1726853707.72289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853707.72683: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853707.72686: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853707.72693: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853707.72696: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853707.72832: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853707.72881: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853707.72933: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853707.72974: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853707.73017: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30583 1726853707.73035: when evaluation is False, skipping this task 30583 1726853707.73057: _execute() done 30583 1726853707.73072: dumping result to json 30583 1726853707.73106: done dumping result, returning 30583 1726853707.73110: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-05ea-abc5-000000000d1a] 30583 1726853707.73112: sending task result for task 
02083763-bbaf-05ea-abc5-000000000d1a skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30583 1726853707.73455: no more pending results, returning what we have 30583 1726853707.73459: results queue empty 30583 1726853707.73461: checking for any_errors_fatal 30583 1726853707.73466: done checking for any_errors_fatal 30583 1726853707.73467: checking for max_fail_percentage 30583 1726853707.73469: done checking for max_fail_percentage 30583 1726853707.73501: checking to see if all hosts have failed and the running result is not ok 30583 1726853707.73503: done checking to see if all hosts have failed 30583 1726853707.73503: getting the remaining hosts for this loop 30583 1726853707.73506: done getting the remaining hosts for this loop 30583 1726853707.73510: getting the next task for host managed_node2 30583 1726853707.73520: done getting next task for host managed_node2 30583 1726853707.73524: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30583 1726853707.73531: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853707.73552: getting variables 30583 1726853707.73554: in VariableManager get_vars() 30583 1726853707.73708: Calling all_inventory to load vars for managed_node2 30583 1726853707.73711: Calling groups_inventory to load vars for managed_node2 30583 1726853707.73714: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853707.73720: done sending task result for task 02083763-bbaf-05ea-abc5-000000000d1a 30583 1726853707.73723: WORKER PROCESS EXITING 30583 1726853707.73731: Calling all_plugins_play to load vars for managed_node2 30583 1726853707.73735: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853707.73738: Calling groups_plugins_play to load vars for managed_node2 30583 1726853707.76665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853707.79241: done with get_vars() 30583 1726853707.79267: done getting variables 30583 1726853707.79335: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due 
to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:35:07 -0400 (0:00:00.138) 0:00:43.130 ****** 30583 1726853707.79370: entering _queue_task() for managed_node2/dnf 30583 1726853707.79861: worker is 1 (out of 1 available) 30583 1726853707.79877: exiting _queue_task() for managed_node2/dnf 30583 1726853707.79887: done queuing things up, now waiting for results queue to drain 30583 1726853707.79889: waiting for pending results... 30583 1726853707.80111: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30583 1726853707.80300: in run() - task 02083763-bbaf-05ea-abc5-000000000d1b 30583 1726853707.80325: variable 'ansible_search_path' from source: unknown 30583 1726853707.80340: variable 'ansible_search_path' from source: unknown 30583 1726853707.80385: calling self._execute() 30583 1726853707.80495: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853707.80508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853707.80528: variable 'omit' from source: magic vars 30583 1726853707.80943: variable 'ansible_distribution_major_version' from source: facts 30583 1726853707.80983: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853707.81194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853707.83488: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853707.83677: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853707.83682: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853707.83685: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853707.83688: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853707.83762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853707.83801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853707.83834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853707.83881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853707.83902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853707.84022: variable 'ansible_distribution' from source: facts 30583 1726853707.84033: variable 'ansible_distribution_major_version' from source: facts 30583 1726853707.84054: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30583 1726853707.84179: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853707.84314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853707.84343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853707.84375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853707.84420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853707.84440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853707.84486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853707.84676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853707.84680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853707.84682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853707.84684: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853707.84687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853707.84689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853707.84691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853707.84728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853707.84747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853707.84911: variable 'network_connections' from source: include params 30583 1726853707.84930: variable 'interface' from source: play vars 30583 1726853707.84997: variable 'interface' from source: play vars 30583 1726853707.85075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853707.85257: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853707.85300: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853707.85335: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853707.85367: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853707.85414: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853707.85442: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853707.85483: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853707.85513: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853707.85562: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853707.85800: variable 'network_connections' from source: include params 30583 1726853707.85811: variable 'interface' from source: play vars 30583 1726853707.85876: variable 'interface' from source: play vars 30583 1726853707.85906: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853707.85914: when evaluation is False, skipping this task 30583 1726853707.85921: _execute() done 30583 1726853707.85928: dumping result to json 30583 1726853707.85934: done dumping result, returning 30583 1726853707.85946: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000000d1b] 30583 
1726853707.85956: sending task result for task 02083763-bbaf-05ea-abc5-000000000d1b skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853707.86113: no more pending results, returning what we have 30583 1726853707.86116: results queue empty 30583 1726853707.86117: checking for any_errors_fatal 30583 1726853707.86126: done checking for any_errors_fatal 30583 1726853707.86127: checking for max_fail_percentage 30583 1726853707.86129: done checking for max_fail_percentage 30583 1726853707.86130: checking to see if all hosts have failed and the running result is not ok 30583 1726853707.86130: done checking to see if all hosts have failed 30583 1726853707.86131: getting the remaining hosts for this loop 30583 1726853707.86133: done getting the remaining hosts for this loop 30583 1726853707.86136: getting the next task for host managed_node2 30583 1726853707.86144: done getting next task for host managed_node2 30583 1726853707.86148: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30583 1726853707.86153: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853707.86178: getting variables 30583 1726853707.86179: in VariableManager get_vars() 30583 1726853707.86213: Calling all_inventory to load vars for managed_node2 30583 1726853707.86216: Calling groups_inventory to load vars for managed_node2 30583 1726853707.86218: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853707.86229: Calling all_plugins_play to load vars for managed_node2 30583 1726853707.86232: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853707.86235: Calling groups_plugins_play to load vars for managed_node2 30583 1726853707.86756: done sending task result for task 02083763-bbaf-05ea-abc5-000000000d1b 30583 1726853707.86759: WORKER PROCESS EXITING 30583 1726853707.87659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853707.89184: done with get_vars() 30583 1726853707.89210: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30583 1726853707.89290: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:35:07 -0400 (0:00:00.099) 0:00:43.230 ****** 30583 1726853707.89324: entering _queue_task() for managed_node2/yum 30583 1726853707.89676: worker is 1 (out of 1 available) 30583 1726853707.89691: exiting _queue_task() for managed_node2/yum 30583 1726853707.89704: done queuing things up, now waiting for results queue to drain 30583 1726853707.89705: waiting for pending results... 30583 1726853707.90093: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30583 1726853707.90141: in run() - task 02083763-bbaf-05ea-abc5-000000000d1c 30583 1726853707.90162: variable 'ansible_search_path' from source: unknown 30583 1726853707.90174: variable 'ansible_search_path' from source: unknown 30583 1726853707.90221: calling self._execute() 30583 1726853707.90326: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853707.90338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853707.90351: variable 'omit' from source: magic vars 30583 1726853707.90735: variable 'ansible_distribution_major_version' from source: facts 30583 1726853707.90753: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853707.90931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853707.93472: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853707.93542: Loading FilterModule 'encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853707.93603: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853707.93643: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853707.93681: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853707.93761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853707.93801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853707.93833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853707.93884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853707.93905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853707.94076: variable 'ansible_distribution_major_version' from source: facts 30583 1726853707.94079: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30583 1726853707.94081: when evaluation is False, skipping this task 30583 1726853707.94083: _execute() done 30583 1726853707.94085: dumping result to json 30583 1726853707.94089: done dumping result, 
returning 30583 1726853707.94091: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000000d1c] 30583 1726853707.94094: sending task result for task 02083763-bbaf-05ea-abc5-000000000d1c skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30583 1726853707.94217: no more pending results, returning what we have 30583 1726853707.94221: results queue empty 30583 1726853707.94222: checking for any_errors_fatal 30583 1726853707.94228: done checking for any_errors_fatal 30583 1726853707.94229: checking for max_fail_percentage 30583 1726853707.94231: done checking for max_fail_percentage 30583 1726853707.94233: checking to see if all hosts have failed and the running result is not ok 30583 1726853707.94234: done checking to see if all hosts have failed 30583 1726853707.94234: getting the remaining hosts for this loop 30583 1726853707.94236: done getting the remaining hosts for this loop 30583 1726853707.94240: getting the next task for host managed_node2 30583 1726853707.94251: done getting next task for host managed_node2 30583 1726853707.94255: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30583 1726853707.94260: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853707.94285: getting variables 30583 1726853707.94287: in VariableManager get_vars() 30583 1726853707.94327: Calling all_inventory to load vars for managed_node2 30583 1726853707.94330: Calling groups_inventory to load vars for managed_node2 30583 1726853707.94333: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853707.94343: Calling all_plugins_play to load vars for managed_node2 30583 1726853707.94346: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853707.94348: Calling groups_plugins_play to load vars for managed_node2 30583 1726853707.95086: done sending task result for task 02083763-bbaf-05ea-abc5-000000000d1c 30583 1726853707.95090: WORKER PROCESS EXITING 30583 1726853707.96438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853707.97945: done with get_vars() 30583 1726853707.97967: done getting variables 30583 1726853707.98028: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:35:07 -0400 (0:00:00.087) 0:00:43.317 ****** 30583 1726853707.98062: entering _queue_task() for managed_node2/fail 30583 1726853707.98403: worker is 1 (out of 1 available) 30583 1726853707.98416: exiting _queue_task() for managed_node2/fail 30583 1726853707.98428: done queuing things up, now waiting for results queue to drain 30583 1726853707.98430: waiting for pending results... 30583 1726853707.98721: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30583 1726853707.98856: in run() - task 02083763-bbaf-05ea-abc5-000000000d1d 30583 1726853707.98879: variable 'ansible_search_path' from source: unknown 30583 1726853707.98889: variable 'ansible_search_path' from source: unknown 30583 1726853707.99078: calling self._execute() 30583 1726853707.99081: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853707.99085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853707.99088: variable 'omit' from source: magic vars 30583 1726853707.99435: variable 'ansible_distribution_major_version' from source: facts 30583 1726853707.99453: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853707.99580: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853707.99781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853708.01942: Loading 
FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853708.02031: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853708.02076: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853708.02119: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853708.02155: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853708.02239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853708.02279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853708.02311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853708.02359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853708.02382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853708.02435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 
1726853708.02468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853708.02500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853708.02544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853708.02573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853708.02678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853708.02681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853708.02683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853708.02714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853708.02735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30583 1726853708.02917: variable 'network_connections' from source: include params 30583 1726853708.02935: variable 'interface' from source: play vars 30583 1726853708.03012: variable 'interface' from source: play vars 30583 1726853708.03089: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853708.03250: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853708.03299: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853708.03336: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853708.03366: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853708.03437: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853708.03440: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853708.03465: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853708.03498: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853708.03556: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853708.03875: variable 'network_connections' from source: include params 30583 1726853708.03878: variable 'interface' from source: play 
vars 30583 1726853708.03888: variable 'interface' from source: play vars 30583 1726853708.03919: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853708.03927: when evaluation is False, skipping this task 30583 1726853708.03934: _execute() done 30583 1726853708.03940: dumping result to json 30583 1726853708.03947: done dumping result, returning 30583 1726853708.03959: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000000d1d] 30583 1726853708.03968: sending task result for task 02083763-bbaf-05ea-abc5-000000000d1d skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853708.04138: no more pending results, returning what we have 30583 1726853708.04142: results queue empty 30583 1726853708.04143: checking for any_errors_fatal 30583 1726853708.04151: done checking for any_errors_fatal 30583 1726853708.04152: checking for max_fail_percentage 30583 1726853708.04154: done checking for max_fail_percentage 30583 1726853708.04155: checking to see if all hosts have failed and the running result is not ok 30583 1726853708.04156: done checking to see if all hosts have failed 30583 1726853708.04157: getting the remaining hosts for this loop 30583 1726853708.04159: done getting the remaining hosts for this loop 30583 1726853708.04163: getting the next task for host managed_node2 30583 1726853708.04173: done getting next task for host managed_node2 30583 1726853708.04177: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30583 1726853708.04184: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853708.04206: getting variables 30583 1726853708.04208: in VariableManager get_vars() 30583 1726853708.04251: Calling all_inventory to load vars for managed_node2 30583 1726853708.04254: Calling groups_inventory to load vars for managed_node2 30583 1726853708.04257: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853708.04267: Calling all_plugins_play to load vars for managed_node2 30583 1726853708.04270: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853708.04385: Calling groups_plugins_play to load vars for managed_node2 30583 1726853708.05084: done sending task result for task 02083763-bbaf-05ea-abc5-000000000d1d 30583 1726853708.05087: WORKER PROCESS EXITING 30583 1726853708.05881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853708.07602: done with get_vars() 30583 1726853708.07625: done getting variables 30583 1726853708.07688: Loading ActionModule 
'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:35:08 -0400 (0:00:00.096) 0:00:43.414 ****** 30583 1726853708.07725: entering _queue_task() for managed_node2/package 30583 1726853708.08061: worker is 1 (out of 1 available) 30583 1726853708.08075: exiting _queue_task() for managed_node2/package 30583 1726853708.08088: done queuing things up, now waiting for results queue to drain 30583 1726853708.08090: waiting for pending results... 30583 1726853708.08374: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 30583 1726853708.08517: in run() - task 02083763-bbaf-05ea-abc5-000000000d1e 30583 1726853708.08536: variable 'ansible_search_path' from source: unknown 30583 1726853708.08544: variable 'ansible_search_path' from source: unknown 30583 1726853708.08588: calling self._execute() 30583 1726853708.08694: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853708.08709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853708.08723: variable 'omit' from source: magic vars 30583 1726853708.09100: variable 'ansible_distribution_major_version' from source: facts 30583 1726853708.09116: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853708.09316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853708.09576: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 
1726853708.09626: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853708.09667: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853708.09751: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853708.09879: variable 'network_packages' from source: role '' defaults 30583 1726853708.09991: variable '__network_provider_setup' from source: role '' defaults 30583 1726853708.10011: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853708.10081: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853708.10095: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853708.10165: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853708.10359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853708.12290: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853708.12351: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853708.12393: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853708.12426: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853708.12450: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853708.12678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853708.12682: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853708.12685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853708.12688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853708.12690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853708.12706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853708.12735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853708.12764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853708.12817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853708.12838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 
1726853708.13073: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853708.13194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853708.13227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853708.13256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853708.13300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853708.13319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853708.13414: variable 'ansible_python' from source: facts 30583 1726853708.13442: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853708.13524: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853708.13611: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853708.13742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853708.13777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853708.13807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853708.13849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853708.13874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853708.13922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853708.13959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853708.13994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853708.14037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853708.14057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853708.14207: variable 'network_connections' from source: include params 
30583 1726853708.14303: variable 'interface' from source: play vars 30583 1726853708.14322: variable 'interface' from source: play vars 30583 1726853708.14394: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853708.14430: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853708.14464: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853708.14501: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853708.14557: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853708.14856: variable 'network_connections' from source: include params 30583 1726853708.14867: variable 'interface' from source: play vars 30583 1726853708.14974: variable 'interface' from source: play vars 30583 1726853708.15007: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853708.15089: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853708.15393: variable 'network_connections' from source: include params 30583 1726853708.15401: variable 'interface' from source: play vars 30583 1726853708.15458: variable 'interface' from source: play vars 30583 1726853708.15489: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853708.15576: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853708.15864: variable 'network_connections' 
from source: include params 30583 1726853708.15876: variable 'interface' from source: play vars 30583 1726853708.15944: variable 'interface' from source: play vars 30583 1726853708.16001: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853708.16066: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853708.16082: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853708.16148: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853708.16458: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853708.16946: variable 'network_connections' from source: include params 30583 1726853708.16956: variable 'interface' from source: play vars 30583 1726853708.17024: variable 'interface' from source: play vars 30583 1726853708.17038: variable 'ansible_distribution' from source: facts 30583 1726853708.17048: variable '__network_rh_distros' from source: role '' defaults 30583 1726853708.17059: variable 'ansible_distribution_major_version' from source: facts 30583 1726853708.17079: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853708.17521: variable 'ansible_distribution' from source: facts 30583 1726853708.17530: variable '__network_rh_distros' from source: role '' defaults 30583 1726853708.17545: variable 'ansible_distribution_major_version' from source: facts 30583 1726853708.17564: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853708.17766: variable 'ansible_distribution' from source: facts 30583 1726853708.17777: variable '__network_rh_distros' from source: role '' defaults 30583 1726853708.17779: variable 'ansible_distribution_major_version' from source: facts 30583 1726853708.17992: variable 'network_provider' from source: set_fact 30583 
1726853708.18215: variable 'ansible_facts' from source: unknown 30583 1726853708.19675: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30583 1726853708.19734: when evaluation is False, skipping this task 30583 1726853708.19742: _execute() done 30583 1726853708.19749: dumping result to json 30583 1726853708.19759: done dumping result, returning 30583 1726853708.19776: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-05ea-abc5-000000000d1e] 30583 1726853708.19787: sending task result for task 02083763-bbaf-05ea-abc5-000000000d1e 30583 1726853708.20251: done sending task result for task 02083763-bbaf-05ea-abc5-000000000d1e 30583 1726853708.20254: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30583 1726853708.20424: no more pending results, returning what we have 30583 1726853708.20428: results queue empty 30583 1726853708.20429: checking for any_errors_fatal 30583 1726853708.20436: done checking for any_errors_fatal 30583 1726853708.20437: checking for max_fail_percentage 30583 1726853708.20440: done checking for max_fail_percentage 30583 1726853708.20441: checking to see if all hosts have failed and the running result is not ok 30583 1726853708.20442: done checking to see if all hosts have failed 30583 1726853708.20443: getting the remaining hosts for this loop 30583 1726853708.20445: done getting the remaining hosts for this loop 30583 1726853708.20449: getting the next task for host managed_node2 30583 1726853708.20462: done getting next task for host managed_node2 30583 1726853708.20466: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30583 1726853708.20473: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853708.20495: getting variables 30583 1726853708.20497: in VariableManager get_vars() 30583 1726853708.20541: Calling all_inventory to load vars for managed_node2 30583 1726853708.20544: Calling groups_inventory to load vars for managed_node2 30583 1726853708.20547: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853708.20561: Calling all_plugins_play to load vars for managed_node2 30583 1726853708.20565: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853708.20568: Calling groups_plugins_play to load vars for managed_node2 30583 1726853708.23007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853708.24655: done with get_vars() 30583 1726853708.24686: done getting variables 30583 1726853708.24750: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:35:08 -0400 (0:00:00.170) 0:00:43.585 ****** 30583 1726853708.24790: entering _queue_task() for managed_node2/package 30583 1726853708.25152: worker is 1 (out of 1 available) 30583 1726853708.25166: exiting _queue_task() for managed_node2/package 30583 1726853708.25379: done queuing things up, now waiting for results queue to drain 30583 1726853708.25381: waiting for pending results... 
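The skip recorded above for the "Install packages" task hinges on Ansible's `subset` test: `not network_packages is subset(ansible_facts.packages.keys())` evaluates False when every required package is already present in the gathered package facts. A minimal Python sketch of that set logic (the package names below are hypothetical, not taken from this run):

```python
# Sketch of the set logic behind Jinja2/Ansible's `subset` test:
# `a is subset(b)` corresponds to set(a) <= set(b) in Python.
network_packages = ["NetworkManager"]                       # hypothetical role default
installed = {"NetworkManager": "1.46.0", "kernel": "6.8"}   # hypothetical ansible_facts.packages

# The task's `when` condition: install only if something is missing.
needs_install = not set(network_packages) <= set(installed.keys())
print(needs_install)  # False -> conditional is False, task is skipped as in the log
```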
30583 1726853708.25516: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30583 1726853708.25701: in run() - task 02083763-bbaf-05ea-abc5-000000000d1f 30583 1726853708.25724: variable 'ansible_search_path' from source: unknown 30583 1726853708.25734: variable 'ansible_search_path' from source: unknown 30583 1726853708.25777: calling self._execute() 30583 1726853708.25888: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853708.25900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853708.25914: variable 'omit' from source: magic vars 30583 1726853708.26301: variable 'ansible_distribution_major_version' from source: facts 30583 1726853708.26367: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853708.26445: variable 'network_state' from source: role '' defaults 30583 1726853708.26460: Evaluated conditional (network_state != {}): False 30583 1726853708.26472: when evaluation is False, skipping this task 30583 1726853708.26483: _execute() done 30583 1726853708.26491: dumping result to json 30583 1726853708.26499: done dumping result, returning 30583 1726853708.26512: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-05ea-abc5-000000000d1f] 30583 1726853708.26521: sending task result for task 02083763-bbaf-05ea-abc5-000000000d1f 30583 1726853708.26658: done sending task result for task 02083763-bbaf-05ea-abc5-000000000d1f 30583 1726853708.26662: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853708.26740: no more pending results, returning what we have 30583 1726853708.26744: results queue empty 30583 1726853708.26745: checking 
for any_errors_fatal 30583 1726853708.26753: done checking for any_errors_fatal 30583 1726853708.26754: checking for max_fail_percentage 30583 1726853708.26756: done checking for max_fail_percentage 30583 1726853708.26757: checking to see if all hosts have failed and the running result is not ok 30583 1726853708.26757: done checking to see if all hosts have failed 30583 1726853708.26758: getting the remaining hosts for this loop 30583 1726853708.26760: done getting the remaining hosts for this loop 30583 1726853708.26764: getting the next task for host managed_node2 30583 1726853708.26775: done getting next task for host managed_node2 30583 1726853708.26779: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30583 1726853708.26785: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853708.26808: getting variables 30583 1726853708.26810: in VariableManager get_vars() 30583 1726853708.26848: Calling all_inventory to load vars for managed_node2 30583 1726853708.26851: Calling groups_inventory to load vars for managed_node2 30583 1726853708.26854: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853708.26866: Calling all_plugins_play to load vars for managed_node2 30583 1726853708.26869: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853708.26980: Calling groups_plugins_play to load vars for managed_node2 30583 1726853708.28605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853708.30297: done with get_vars() 30583 1726853708.30322: done getting variables 30583 1726853708.30381: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:35:08 -0400 (0:00:00.056) 0:00:43.641 ****** 30583 1726853708.30414: entering _queue_task() for managed_node2/package 30583 1726853708.30744: worker is 1 (out of 1 available) 30583 1726853708.30757: exiting _queue_task() for managed_node2/package 30583 1726853708.30770: done queuing things up, now waiting for results queue to drain 30583 1726853708.30773: waiting for pending results... 
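The two skips in this stretch ("Install NetworkManager and nmstate when using network_state variable" and the python3-libnmstate task queued next) both guard on `network_state != {}`; since `network_state` comes from the role defaults as an empty dict here, the condition is False. A sketch of that evaluation (the default value is assumed from the "role '' defaults" source shown in the log):

```python
# The `when: network_state != {}` guard: with the role-default empty
# dict, the comparison is False and the task is skipped.
network_state = {}                 # assumed role default, per the log's variable source
run_task = network_state != {}
print(run_task)  # False -> "Conditional result was False" in the task result
```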
30583 1726853708.31190: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30583 1726853708.31220: in run() - task 02083763-bbaf-05ea-abc5-000000000d20 30583 1726853708.31239: variable 'ansible_search_path' from source: unknown 30583 1726853708.31248: variable 'ansible_search_path' from source: unknown 30583 1726853708.31294: calling self._execute() 30583 1726853708.31396: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853708.31408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853708.31422: variable 'omit' from source: magic vars 30583 1726853708.31802: variable 'ansible_distribution_major_version' from source: facts 30583 1726853708.31818: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853708.31942: variable 'network_state' from source: role '' defaults 30583 1726853708.32177: Evaluated conditional (network_state != {}): False 30583 1726853708.32180: when evaluation is False, skipping this task 30583 1726853708.32183: _execute() done 30583 1726853708.32186: dumping result to json 30583 1726853708.32189: done dumping result, returning 30583 1726853708.32192: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-05ea-abc5-000000000d20] 30583 1726853708.32194: sending task result for task 02083763-bbaf-05ea-abc5-000000000d20 30583 1726853708.32273: done sending task result for task 02083763-bbaf-05ea-abc5-000000000d20 30583 1726853708.32276: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853708.32321: no more pending results, returning what we have 30583 1726853708.32324: results queue empty 30583 1726853708.32325: checking for 
any_errors_fatal 30583 1726853708.32333: done checking for any_errors_fatal 30583 1726853708.32333: checking for max_fail_percentage 30583 1726853708.32336: done checking for max_fail_percentage 30583 1726853708.32337: checking to see if all hosts have failed and the running result is not ok 30583 1726853708.32337: done checking to see if all hosts have failed 30583 1726853708.32338: getting the remaining hosts for this loop 30583 1726853708.32340: done getting the remaining hosts for this loop 30583 1726853708.32343: getting the next task for host managed_node2 30583 1726853708.32352: done getting next task for host managed_node2 30583 1726853708.32355: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30583 1726853708.32361: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853708.32384: getting variables 30583 1726853708.32386: in VariableManager get_vars() 30583 1726853708.32424: Calling all_inventory to load vars for managed_node2 30583 1726853708.32427: Calling groups_inventory to load vars for managed_node2 30583 1726853708.32430: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853708.32441: Calling all_plugins_play to load vars for managed_node2 30583 1726853708.32445: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853708.32448: Calling groups_plugins_play to load vars for managed_node2 30583 1726853708.34290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853708.36246: done with get_vars() 30583 1726853708.36278: done getting variables 30583 1726853708.36337: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:35:08 -0400 (0:00:00.059) 0:00:43.700 ****** 30583 1726853708.36375: entering _queue_task() for managed_node2/service 30583 1726853708.36702: worker is 1 (out of 1 available) 30583 1726853708.36714: exiting _queue_task() for managed_node2/service 30583 1726853708.36726: done queuing things up, now waiting for results queue to drain 30583 1726853708.36727: waiting for pending results... 
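Every task in this run first evaluates `ansible_distribution_major_version != '6'`. Note the comparison is against the string `'6'`: Ansible facts store the major version as a string, so a quoted literal is used rather than an integer. A sketch (the fact value below is hypothetical):

```python
# The distribution-version guard evaluated before each task in this log.
# Facts hold the major version as a string, so compare against '6', not 6.
ansible_distribution_major_version = "9"   # hypothetical fact value
run_task = ansible_distribution_major_version != "6"
print(run_task)  # True -> "Evaluated conditional ... : True", matching the log
```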
30583 1726853708.37006: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30583 1726853708.37178: in run() - task 02083763-bbaf-05ea-abc5-000000000d21 30583 1726853708.37181: variable 'ansible_search_path' from source: unknown 30583 1726853708.37196: variable 'ansible_search_path' from source: unknown 30583 1726853708.37306: calling self._execute() 30583 1726853708.37333: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853708.37346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853708.37360: variable 'omit' from source: magic vars 30583 1726853708.37736: variable 'ansible_distribution_major_version' from source: facts 30583 1726853708.37756: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853708.37907: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853708.38109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853708.41666: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853708.41744: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853708.41986: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853708.41990: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853708.42007: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853708.42095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30583 1726853708.42303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853708.42333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853708.42487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853708.42595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853708.42600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853708.42602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853708.42712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853708.42756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853708.42781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853708.42897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853708.42929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853708.42960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853708.43068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853708.43157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853708.43480: variable 'network_connections' from source: include params 30583 1726853708.43602: variable 'interface' from source: play vars 30583 1726853708.43679: variable 'interface' from source: play vars 30583 1726853708.43767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853708.44130: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853708.44289: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853708.44391: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853708.44440: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853708.44622: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853708.44654: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853708.44878: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853708.44882: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853708.44885: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853708.45376: variable 'network_connections' from source: include params 30583 1726853708.45489: variable 'interface' from source: play vars 30583 1726853708.45605: variable 'interface' from source: play vars 30583 1726853708.45686: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853708.45701: when evaluation is False, skipping this task 30583 1726853708.45711: _execute() done 30583 1726853708.45773: dumping result to json 30583 1726853708.45783: done dumping result, returning 30583 1726853708.45800: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000000d21] 30583 1726853708.45815: sending task result for task 02083763-bbaf-05ea-abc5-000000000d21 skipping: [managed_node2] => { "changed": false, "false_condition": 
"__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853708.46131: no more pending results, returning what we have 30583 1726853708.46135: results queue empty 30583 1726853708.46136: checking for any_errors_fatal 30583 1726853708.46144: done checking for any_errors_fatal 30583 1726853708.46145: checking for max_fail_percentage 30583 1726853708.46147: done checking for max_fail_percentage 30583 1726853708.46148: checking to see if all hosts have failed and the running result is not ok 30583 1726853708.46149: done checking to see if all hosts have failed 30583 1726853708.46149: getting the remaining hosts for this loop 30583 1726853708.46151: done getting the remaining hosts for this loop 30583 1726853708.46155: getting the next task for host managed_node2 30583 1726853708.46163: done getting next task for host managed_node2 30583 1726853708.46168: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30583 1726853708.46175: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853708.46197: getting variables 30583 1726853708.46199: in VariableManager get_vars() 30583 1726853708.46243: Calling all_inventory to load vars for managed_node2 30583 1726853708.46246: Calling groups_inventory to load vars for managed_node2 30583 1726853708.46249: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853708.46259: Calling all_plugins_play to load vars for managed_node2 30583 1726853708.46262: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853708.46265: Calling groups_plugins_play to load vars for managed_node2 30583 1726853708.47242: done sending task result for task 02083763-bbaf-05ea-abc5-000000000d21 30583 1726853708.47246: WORKER PROCESS EXITING 30583 1726853708.50206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853708.52103: done with get_vars() 30583 1726853708.52133: done getting variables 30583 1726853708.52197: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:35:08 -0400 (0:00:00.158) 0:00:43.859 ****** 30583 1726853708.52234: entering _queue_task() for managed_node2/service 30583 1726853708.52708: worker is 1 (out of 1 available) 30583 1726853708.52720: exiting _queue_task() for managed_node2/service 30583 1726853708.52731: done queuing 
things up, now waiting for results queue to drain 30583 1726853708.52733: waiting for pending results... 30583 1726853708.52944: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30583 1726853708.53106: in run() - task 02083763-bbaf-05ea-abc5-000000000d22 30583 1726853708.53126: variable 'ansible_search_path' from source: unknown 30583 1726853708.53134: variable 'ansible_search_path' from source: unknown 30583 1726853708.53182: calling self._execute() 30583 1726853708.53290: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853708.53302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853708.53317: variable 'omit' from source: magic vars 30583 1726853708.53935: variable 'ansible_distribution_major_version' from source: facts 30583 1726853708.53939: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853708.54176: variable 'network_provider' from source: set_fact 30583 1726853708.54189: variable 'network_state' from source: role '' defaults 30583 1726853708.54204: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30583 1726853708.54215: variable 'omit' from source: magic vars 30583 1726853708.54292: variable 'omit' from source: magic vars 30583 1726853708.54328: variable 'network_service_name' from source: role '' defaults 30583 1726853708.54403: variable 'network_service_name' from source: role '' defaults 30583 1726853708.54518: variable '__network_provider_setup' from source: role '' defaults 30583 1726853708.54530: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853708.54603: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853708.54617: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853708.54683: variable '__network_packages_default_nm' from source: role '' defaults 
30583 1726853708.55076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853708.57497: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853708.57621: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853708.57669: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853708.57710: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853708.57740: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853708.57828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853708.57966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853708.57969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853708.57974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853708.57976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853708.58009: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853708.58039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853708.58069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853708.58121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853708.58140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853708.58387: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853708.58516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853708.58552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853708.58638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853708.58642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853708.58648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853708.58850: variable 'ansible_python' from source: facts 30583 1726853708.58875: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853708.59076: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853708.59158: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853708.59527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853708.59531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853708.59534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853708.59675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853708.59692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853708.59739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853708.59766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853708.59793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853708.59840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853708.59863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853708.60021: variable 'network_connections' from source: include params 30583 1726853708.60035: variable 'interface' from source: play vars 30583 1726853708.60127: variable 'interface' from source: play vars 30583 1726853708.60249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853708.60516: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853708.60553: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853708.60609: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853708.60663: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853708.60777: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853708.60780: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853708.60814: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853708.60861: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853708.60917: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853708.61244: variable 'network_connections' from source: include params 30583 1726853708.61377: variable 'interface' from source: play vars 30583 1726853708.61381: variable 'interface' from source: play vars 30583 1726853708.61404: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853708.61506: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853708.61838: variable 'network_connections' from source: include params 30583 1726853708.61850: variable 'interface' from source: play vars 30583 1726853708.61927: variable 'interface' from source: play vars 30583 1726853708.61964: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853708.62050: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853708.62479: variable 'network_connections' from source: include params 30583 1726853708.62483: variable 'interface' from source: play vars 30583 1726853708.62485: variable 'interface' from source: play vars 30583 1726853708.62516: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30583 1726853708.62589: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853708.62607: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853708.62669: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853708.62904: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853708.63461: variable 'network_connections' from source: include params 30583 1726853708.63484: variable 'interface' from source: play vars 30583 1726853708.63557: variable 'interface' from source: play vars 30583 1726853708.63674: variable 'ansible_distribution' from source: facts 30583 1726853708.63679: variable '__network_rh_distros' from source: role '' defaults 30583 1726853708.63682: variable 'ansible_distribution_major_version' from source: facts 30583 1726853708.63685: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853708.63803: variable 'ansible_distribution' from source: facts 30583 1726853708.63813: variable '__network_rh_distros' from source: role '' defaults 30583 1726853708.63824: variable 'ansible_distribution_major_version' from source: facts 30583 1726853708.63858: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853708.64131: variable 'ansible_distribution' from source: facts 30583 1726853708.64147: variable '__network_rh_distros' from source: role '' defaults 30583 1726853708.64162: variable 'ansible_distribution_major_version' from source: facts 30583 1726853708.64308: variable 'network_provider' from source: set_fact 30583 1726853708.64336: variable 'omit' from source: magic vars 30583 1726853708.64411: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853708.64415: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853708.64435: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853708.64460: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853708.64482: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853708.64518: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853708.64578: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853708.64581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853708.64654: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853708.64670: Set connection var ansible_timeout to 10 30583 1726853708.64686: Set connection var ansible_connection to ssh 30583 1726853708.64698: Set connection var ansible_shell_executable to /bin/sh 30583 1726853708.64704: Set connection var ansible_shell_type to sh 30583 1726853708.64718: Set connection var ansible_pipelining to False 30583 1726853708.64754: variable 'ansible_shell_executable' from source: unknown 30583 1726853708.64776: variable 'ansible_connection' from source: unknown 30583 1726853708.64779: variable 'ansible_module_compression' from source: unknown 30583 1726853708.64781: variable 'ansible_shell_type' from source: unknown 30583 1726853708.64794: variable 'ansible_shell_executable' from source: unknown 30583 1726853708.64797: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853708.64846: variable 'ansible_pipelining' from source: unknown 30583 1726853708.64850: variable 'ansible_timeout' from source: unknown 30583 1726853708.64852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 
1726853708.64940: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853708.64966: variable 'omit' from source: magic vars 30583 1726853708.64980: starting attempt loop 30583 1726853708.64987: running the handler 30583 1726853708.65082: variable 'ansible_facts' from source: unknown 30583 1726853708.65980: _low_level_execute_command(): starting 30583 1726853708.65984: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853708.66782: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853708.66832: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853708.66859: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853708.66885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 
1726853708.66959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853708.68740: stdout chunk (state=3): >>>/root <<< 30583 1726853708.68874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853708.68877: stdout chunk (state=3): >>><<< 30583 1726853708.68880: stderr chunk (state=3): >>><<< 30583 1726853708.69017: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853708.69022: _low_level_execute_command(): starting 30583 1726853708.69025: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853708.6892645-32621-154582116378908 `" && echo ansible-tmp-1726853708.6892645-32621-154582116378908="` echo 
/root/.ansible/tmp/ansible-tmp-1726853708.6892645-32621-154582116378908 `" ) && sleep 0' 30583 1726853708.69550: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853708.69569: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853708.69586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853708.69605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853708.69693: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853708.69723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853708.69738: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853708.69764: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853708.69870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853708.71882: stdout chunk (state=3): >>>ansible-tmp-1726853708.6892645-32621-154582116378908=/root/.ansible/tmp/ansible-tmp-1726853708.6892645-32621-154582116378908 <<< 30583 1726853708.72068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 
1726853708.72277: stderr chunk (state=3): >>><<< 30583 1726853708.72280: stdout chunk (state=3): >>><<< 30583 1726853708.72284: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853708.6892645-32621-154582116378908=/root/.ansible/tmp/ansible-tmp-1726853708.6892645-32621-154582116378908 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853708.72287: variable 'ansible_module_compression' from source: unknown 30583 1726853708.72289: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30583 1726853708.72313: variable 'ansible_facts' from source: unknown 30583 1726853708.72668: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853708.6892645-32621-154582116378908/AnsiballZ_systemd.py 30583 1726853708.73097: Sending initial data 30583 
1726853708.73102: Sent initial data (156 bytes) 30583 1726853708.73892: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853708.73952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853708.73991: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853708.74008: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853708.74034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853708.74168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853708.75812: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: 
Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853708.75912: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853708.75979: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp33pjswd5 /root/.ansible/tmp/ansible-tmp-1726853708.6892645-32621-154582116378908/AnsiballZ_systemd.py <<< 30583 1726853708.75983: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853708.6892645-32621-154582116378908/AnsiballZ_systemd.py" <<< 30583 1726853708.76084: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp33pjswd5" to remote "/root/.ansible/tmp/ansible-tmp-1726853708.6892645-32621-154582116378908/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853708.6892645-32621-154582116378908/AnsiballZ_systemd.py" <<< 30583 1726853708.78030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853708.78076: stdout chunk (state=3): >>><<< 30583 1726853708.78080: stderr chunk (state=3): >>><<< 30583 1726853708.78082: done transferring module to remote 30583 1726853708.78096: _low_level_execute_command(): starting 30583 1726853708.78106: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853708.6892645-32621-154582116378908/ /root/.ansible/tmp/ansible-tmp-1726853708.6892645-32621-154582116378908/AnsiballZ_systemd.py && sleep 0' 30583 1726853708.78768: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853708.78786: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 30583 1726853708.78800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853708.78818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853708.78887: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853708.78951: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853708.78975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853708.78994: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853708.79172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853708.81185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853708.81189: stdout chunk (state=3): >>><<< 30583 1726853708.81191: stderr chunk (state=3): >>><<< 30583 1726853708.81246: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853708.81336: _low_level_execute_command(): starting 30583 1726853708.81342: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853708.6892645-32621-154582116378908/AnsiballZ_systemd.py && sleep 0' 30583 1726853708.82118: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853708.82177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853708.82190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853708.82238: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853708.82283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853708.82360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853709.12328: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", 
"ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4648960", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3308515328", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1832081000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": 
"[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": 
"no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", 
"InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30583 1726853709.14313: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853709.14334: stderr chunk (state=3): >>>Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853709.14463: stderr chunk (state=3): >>><<< 30583 1726853709.14466: stdout chunk (state=3): >>><<< 30583 1726853709.14784: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4648960", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3308515328", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1832081000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", 
"ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853709.14945: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853708.6892645-32621-154582116378908/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853709.15025: _low_level_execute_command(): starting 30583 1726853709.15062: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853708.6892645-32621-154582116378908/ > /dev/null 2>&1 && sleep 0' 30583 1726853709.15774: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853709.15784: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853709.15795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853709.15809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853709.15821: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853709.15828: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853709.15838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853709.15900: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853709.15980: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853709.15984: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853709.16024: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853709.16173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853709.18040: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853709.18084: stderr chunk (state=3): >>><<< 30583 1726853709.18087: stdout chunk (state=3): >>><<< 30583 1726853709.18132: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853709.18136: handler run complete 30583 1726853709.18258: attempt loop complete, returning result 30583 1726853709.18261: _execute() done 30583 1726853709.18264: dumping result to json 30583 1726853709.18266: done dumping result, returning 30583 1726853709.18268: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-05ea-abc5-000000000d22] 30583 1726853709.18272: sending task result for task 02083763-bbaf-05ea-abc5-000000000d22 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853709.19017: done sending task result for task 02083763-bbaf-05ea-abc5-000000000d22 30583 1726853709.19020: WORKER PROCESS EXITING 30583 1726853709.19035: no more pending results, returning what we have 30583 1726853709.19038: results queue empty 30583 1726853709.19039: checking for any_errors_fatal 30583 1726853709.19044: done checking for any_errors_fatal 30583 1726853709.19044: checking for max_fail_percentage 30583 1726853709.19046: done checking for max_fail_percentage 30583 1726853709.19047: checking to see if all hosts have failed and the running result is not ok 30583 1726853709.19048: done checking to see if all hosts have failed 30583 1726853709.19048: getting the remaining hosts for this loop 30583 1726853709.19050: done getting the remaining hosts for this loop 30583 1726853709.19053: getting the next task for host managed_node2 30583 1726853709.19062: done getting next task for host managed_node2 30583 1726853709.19066: ^ task is: TASK: 
fedora.linux_system_roles.network : Enable and start wpa_supplicant 30583 1726853709.19077: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853709.19089: getting variables 30583 1726853709.19092: in VariableManager get_vars() 30583 1726853709.19124: Calling all_inventory to load vars for managed_node2 30583 1726853709.19127: Calling groups_inventory to load vars for managed_node2 30583 1726853709.19129: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853709.19139: Calling all_plugins_play to load vars for managed_node2 30583 1726853709.19141: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853709.19144: Calling groups_plugins_play to load vars for managed_node2 30583 1726853709.20597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853709.22481: done with get_vars() 30583 1726853709.22506: done getting variables 30583 1726853709.22567: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:35:09 -0400 (0:00:00.703) 0:00:44.563 ****** 30583 1726853709.22611: entering _queue_task() for managed_node2/service 30583 1726853709.23376: worker is 1 (out of 1 available) 30583 1726853709.23389: exiting _queue_task() for managed_node2/service 30583 1726853709.23402: done queuing things up, now waiting for results queue to drain 30583 1726853709.23404: waiting for pending results... 
30583 1726853709.24093: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30583 1726853709.24401: in run() - task 02083763-bbaf-05ea-abc5-000000000d23 30583 1726853709.24491: variable 'ansible_search_path' from source: unknown 30583 1726853709.24495: variable 'ansible_search_path' from source: unknown 30583 1726853709.24531: calling self._execute() 30583 1726853709.24887: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853709.24896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853709.24908: variable 'omit' from source: magic vars 30583 1726853709.25389: variable 'ansible_distribution_major_version' from source: facts 30583 1726853709.25393: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853709.25616: variable 'network_provider' from source: set_fact 30583 1726853709.25627: Evaluated conditional (network_provider == "nm"): True 30583 1726853709.25951: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853709.25997: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853709.26191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853709.29304: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853709.29367: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853709.29421: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853709.29466: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853709.29506: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853709.29610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853709.29678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853709.29692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853709.29898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853709.29901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853709.29909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853709.29938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853709.29976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853709.30063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853709.30086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853709.30133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853709.30220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853709.30223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853709.30244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853709.30275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853709.30439: variable 'network_connections' from source: include params 30583 1726853709.30461: variable 'interface' from source: play vars 30583 1726853709.30742: variable 'interface' from source: play vars 30583 1726853709.30818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853709.31112: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853709.31216: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853709.31316: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853709.31348: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853709.31438: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853709.31527: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853709.31563: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853709.31639: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853709.31779: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853709.32393: variable 'network_connections' from source: include params 30583 1726853709.32405: variable 'interface' from source: play vars 30583 1726853709.32558: variable 'interface' from source: play vars 30583 1726853709.32628: Evaluated conditional (__network_wpa_supplicant_required): False 30583 1726853709.32667: when evaluation is False, skipping this task 30583 1726853709.32701: _execute() done 30583 1726853709.32711: dumping result to json 30583 1726853709.32719: done dumping result, returning 30583 1726853709.32733: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-05ea-abc5-000000000d23] 30583 
1726853709.32913: sending task result for task 02083763-bbaf-05ea-abc5-000000000d23 30583 1726853709.33046: done sending task result for task 02083763-bbaf-05ea-abc5-000000000d23 30583 1726853709.33050: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30583 1726853709.33098: no more pending results, returning what we have 30583 1726853709.33101: results queue empty 30583 1726853709.33102: checking for any_errors_fatal 30583 1726853709.33134: done checking for any_errors_fatal 30583 1726853709.33135: checking for max_fail_percentage 30583 1726853709.33138: done checking for max_fail_percentage 30583 1726853709.33139: checking to see if all hosts have failed and the running result is not ok 30583 1726853709.33140: done checking to see if all hosts have failed 30583 1726853709.33141: getting the remaining hosts for this loop 30583 1726853709.33143: done getting the remaining hosts for this loop 30583 1726853709.33147: getting the next task for host managed_node2 30583 1726853709.33156: done getting next task for host managed_node2 30583 1726853709.33160: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30583 1726853709.33165: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853709.33188: getting variables 30583 1726853709.33190: in VariableManager get_vars() 30583 1726853709.33226: Calling all_inventory to load vars for managed_node2 30583 1726853709.33228: Calling groups_inventory to load vars for managed_node2 30583 1726853709.33230: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853709.33241: Calling all_plugins_play to load vars for managed_node2 30583 1726853709.33244: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853709.33247: Calling groups_plugins_play to load vars for managed_node2 30583 1726853709.35941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853709.39064: done with get_vars() 30583 1726853709.39104: done getting variables 30583 1726853709.39166: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:35:09 -0400 (0:00:00.166) 0:00:44.729 
****** 30583 1726853709.39266: entering _queue_task() for managed_node2/service 30583 1726853709.40116: worker is 1 (out of 1 available) 30583 1726853709.40128: exiting _queue_task() for managed_node2/service 30583 1726853709.40141: done queuing things up, now waiting for results queue to drain 30583 1726853709.40142: waiting for pending results... 30583 1726853709.40499: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30583 1726853709.40939: in run() - task 02083763-bbaf-05ea-abc5-000000000d24 30583 1726853709.40953: variable 'ansible_search_path' from source: unknown 30583 1726853709.40959: variable 'ansible_search_path' from source: unknown 30583 1726853709.41026: calling self._execute() 30583 1726853709.41297: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853709.41304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853709.41313: variable 'omit' from source: magic vars 30583 1726853709.42010: variable 'ansible_distribution_major_version' from source: facts 30583 1726853709.42013: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853709.42130: variable 'network_provider' from source: set_fact 30583 1726853709.42133: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853709.42136: when evaluation is False, skipping this task 30583 1726853709.42138: _execute() done 30583 1726853709.42141: dumping result to json 30583 1726853709.42143: done dumping result, returning 30583 1726853709.42146: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-05ea-abc5-000000000d24] 30583 1726853709.42147: sending task result for task 02083763-bbaf-05ea-abc5-000000000d24 30583 1726853709.42299: done sending task result for task 02083763-bbaf-05ea-abc5-000000000d24 30583 1726853709.42302: WORKER PROCESS EXITING skipping: 
[managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853709.42375: no more pending results, returning what we have 30583 1726853709.42379: results queue empty 30583 1726853709.42380: checking for any_errors_fatal 30583 1726853709.42385: done checking for any_errors_fatal 30583 1726853709.42386: checking for max_fail_percentage 30583 1726853709.42388: done checking for max_fail_percentage 30583 1726853709.42389: checking to see if all hosts have failed and the running result is not ok 30583 1726853709.42389: done checking to see if all hosts have failed 30583 1726853709.42390: getting the remaining hosts for this loop 30583 1726853709.42391: done getting the remaining hosts for this loop 30583 1726853709.42394: getting the next task for host managed_node2 30583 1726853709.42402: done getting next task for host managed_node2 30583 1726853709.42405: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853709.42410: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853709.42428: getting variables 30583 1726853709.42429: in VariableManager get_vars() 30583 1726853709.42502: Calling all_inventory to load vars for managed_node2 30583 1726853709.42505: Calling groups_inventory to load vars for managed_node2 30583 1726853709.42507: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853709.42516: Calling all_plugins_play to load vars for managed_node2 30583 1726853709.42518: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853709.42521: Calling groups_plugins_play to load vars for managed_node2 30583 1726853709.44747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853709.47034: done with get_vars() 30583 1726853709.47074: done getting variables 30583 1726853709.47228: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:35:09 -0400 (0:00:00.080) 0:00:44.810 ****** 30583 1726853709.47291: entering _queue_task() for managed_node2/copy 30583 1726853709.47708: worker is 1 (out of 1 available) 30583 1726853709.47721: exiting _queue_task() for managed_node2/copy 30583 1726853709.47979: done queuing things up, now waiting for results queue to drain 30583 1726853709.47981: waiting for 
pending results... 30583 1726853709.48109: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853709.48312: in run() - task 02083763-bbaf-05ea-abc5-000000000d25 30583 1726853709.48320: variable 'ansible_search_path' from source: unknown 30583 1726853709.48323: variable 'ansible_search_path' from source: unknown 30583 1726853709.48326: calling self._execute() 30583 1726853709.48447: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853709.48461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853709.48479: variable 'omit' from source: magic vars 30583 1726853709.48982: variable 'ansible_distribution_major_version' from source: facts 30583 1726853709.49073: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853709.49137: variable 'network_provider' from source: set_fact 30583 1726853709.49164: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853709.49286: when evaluation is False, skipping this task 30583 1726853709.49290: _execute() done 30583 1726853709.49293: dumping result to json 30583 1726853709.49295: done dumping result, returning 30583 1726853709.49298: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-05ea-abc5-000000000d25] 30583 1726853709.49301: sending task result for task 02083763-bbaf-05ea-abc5-000000000d25 30583 1726853709.49384: done sending task result for task 02083763-bbaf-05ea-abc5-000000000d25 30583 1726853709.49387: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30583 1726853709.49551: no more pending results, returning what we have 30583 1726853709.49558: results queue empty 30583 
1726853709.49559: checking for any_errors_fatal 30583 1726853709.49567: done checking for any_errors_fatal 30583 1726853709.49568: checking for max_fail_percentage 30583 1726853709.49573: done checking for max_fail_percentage 30583 1726853709.49574: checking to see if all hosts have failed and the running result is not ok 30583 1726853709.49574: done checking to see if all hosts have failed 30583 1726853709.49575: getting the remaining hosts for this loop 30583 1726853709.49577: done getting the remaining hosts for this loop 30583 1726853709.49581: getting the next task for host managed_node2 30583 1726853709.49591: done getting next task for host managed_node2 30583 1726853709.49595: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853709.49600: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853709.49632: getting variables 30583 1726853709.49634: in VariableManager get_vars() 30583 1726853709.49679: Calling all_inventory to load vars for managed_node2 30583 1726853709.49683: Calling groups_inventory to load vars for managed_node2 30583 1726853709.49686: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853709.49700: Calling all_plugins_play to load vars for managed_node2 30583 1726853709.49703: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853709.49706: Calling groups_plugins_play to load vars for managed_node2 30583 1726853709.52865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853709.59246: done with get_vars() 30583 1726853709.59286: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:35:09 -0400 (0:00:00.121) 0:00:44.931 ****** 30583 1726853709.59428: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853709.60196: worker is 1 (out of 1 available) 30583 1726853709.60207: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853709.60220: done queuing things up, now waiting for results queue to drain 30583 1726853709.60221: waiting for pending results... 
30583 1726853709.60524: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853709.60691: in run() - task 02083763-bbaf-05ea-abc5-000000000d26 30583 1726853709.60718: variable 'ansible_search_path' from source: unknown 30583 1726853709.60776: variable 'ansible_search_path' from source: unknown 30583 1726853709.60786: calling self._execute() 30583 1726853709.61190: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853709.61198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853709.61201: variable 'omit' from source: magic vars 30583 1726853709.62611: variable 'ansible_distribution_major_version' from source: facts 30583 1726853709.62631: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853709.62645: variable 'omit' from source: magic vars 30583 1726853709.62934: variable 'omit' from source: magic vars 30583 1726853709.63280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853709.66527: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853709.66783: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853709.66838: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853709.66884: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853709.66950: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853709.67049: variable 'network_provider' from source: set_fact 30583 1726853709.67288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853709.67328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853709.67395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853709.67560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853709.67679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853709.67823: variable 'omit' from source: magic vars 30583 1726853709.68159: variable 'omit' from source: magic vars 30583 1726853709.68456: variable 'network_connections' from source: include params 30583 1726853709.68460: variable 'interface' from source: play vars 30583 1726853709.68609: variable 'interface' from source: play vars 30583 1726853709.68810: variable 'omit' from source: magic vars 30583 1726853709.68825: variable '__lsr_ansible_managed' from source: task vars 30583 1726853709.68909: variable '__lsr_ansible_managed' from source: task vars 30583 1726853709.69088: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30583 1726853709.69242: Loaded config def from plugin (lookup/template) 30583 1726853709.69247: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30583 1726853709.69267: File lookup term: get_ansible_managed.j2 30583 1726853709.69270: variable 
'ansible_search_path' from source: unknown 30583 1726853709.69275: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30583 1726853709.69298: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30583 1726853709.69302: variable 'ansible_search_path' from source: unknown 30583 1726853709.74435: variable 'ansible_managed' from source: unknown 30583 1726853709.74611: variable 'omit' from source: magic vars 30583 1726853709.74615: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853709.74618: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853709.74653: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853709.74732: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30583 1726853709.74735: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853709.74737: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853709.74739: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853709.74742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853709.75082: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853709.75085: Set connection var ansible_timeout to 10 30583 1726853709.75087: Set connection var ansible_connection to ssh 30583 1726853709.75089: Set connection var ansible_shell_executable to /bin/sh 30583 1726853709.75186: Set connection var ansible_shell_type to sh 30583 1726853709.75190: Set connection var ansible_pipelining to False 30583 1726853709.75254: variable 'ansible_shell_executable' from source: unknown 30583 1726853709.75257: variable 'ansible_connection' from source: unknown 30583 1726853709.75259: variable 'ansible_module_compression' from source: unknown 30583 1726853709.75261: variable 'ansible_shell_type' from source: unknown 30583 1726853709.75263: variable 'ansible_shell_executable' from source: unknown 30583 1726853709.75266: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853709.75268: variable 'ansible_pipelining' from source: unknown 30583 1726853709.75270: variable 'ansible_timeout' from source: unknown 30583 1726853709.75275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853709.75476: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853709.75486: variable 'omit' from 
source: magic vars 30583 1726853709.75621: starting attempt loop 30583 1726853709.75679: running the handler 30583 1726853709.75683: _low_level_execute_command(): starting 30583 1726853709.75685: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853709.76411: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853709.76419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853709.76424: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853709.76437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853709.76443: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853709.76448: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853709.76453: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853709.76476: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853709.76479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853709.76481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853709.76590: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853709.76830: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 30583 1726853709.78476: stdout chunk (state=3): >>>/root <<< 30583 1726853709.78696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853709.78699: stdout chunk (state=3): >>><<< 30583 1726853709.78701: stderr chunk (state=3): >>><<< 30583 1726853709.78720: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853709.78962: _low_level_execute_command(): starting 30583 1726853709.78967: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853709.7871652-32688-177023164147686 `" && echo ansible-tmp-1726853709.7871652-32688-177023164147686="` echo /root/.ansible/tmp/ansible-tmp-1726853709.7871652-32688-177023164147686 
`" ) && sleep 0' 30583 1726853709.79328: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853709.79332: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853709.79349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853709.79365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853709.79377: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853709.79385: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853709.79428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853709.79432: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853709.79435: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853709.79437: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853709.79440: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853709.79442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853709.79477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853709.79480: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853709.79482: stderr chunk (state=3): >>>debug2: match found <<< 30583 1726853709.79484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853709.79588: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853709.79591: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 30583 1726853709.79788: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853709.79888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853709.81914: stdout chunk (state=3): >>>ansible-tmp-1726853709.7871652-32688-177023164147686=/root/.ansible/tmp/ansible-tmp-1726853709.7871652-32688-177023164147686 <<< 30583 1726853709.82077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853709.82080: stdout chunk (state=3): >>><<< 30583 1726853709.82083: stderr chunk (state=3): >>><<< 30583 1726853709.82085: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853709.7871652-32688-177023164147686=/root/.ansible/tmp/ansible-tmp-1726853709.7871652-32688-177023164147686 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 
0 30583 1726853709.82119: variable 'ansible_module_compression' from source: unknown 30583 1726853709.82235: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30583 1726853709.82238: variable 'ansible_facts' from source: unknown 30583 1726853709.82469: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853709.7871652-32688-177023164147686/AnsiballZ_network_connections.py 30583 1726853709.82568: Sending initial data 30583 1726853709.82574: Sent initial data (168 bytes) 30583 1726853709.83070: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853709.83081: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853709.83092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853709.83200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853709.83204: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853709.83226: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30583 1726853709.83331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853709.85022: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30583 1726853709.85026: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853709.85101: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853709.85200: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpbmxt1_ez /root/.ansible/tmp/ansible-tmp-1726853709.7871652-32688-177023164147686/AnsiballZ_network_connections.py <<< 30583 1726853709.85204: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853709.7871652-32688-177023164147686/AnsiballZ_network_connections.py" <<< 30583 1726853709.85276: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpbmxt1_ez" to remote "/root/.ansible/tmp/ansible-tmp-1726853709.7871652-32688-177023164147686/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853709.7871652-32688-177023164147686/AnsiballZ_network_connections.py" <<< 30583 1726853709.86413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853709.86563: stderr chunk (state=3): >>><<< 30583 1726853709.86566: stdout chunk (state=3): >>><<< 30583 1726853709.86568: done transferring module to remote 30583 1726853709.86572: _low_level_execute_command(): starting 30583 1726853709.86575: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853709.7871652-32688-177023164147686/ /root/.ansible/tmp/ansible-tmp-1726853709.7871652-32688-177023164147686/AnsiballZ_network_connections.py && sleep 0' 30583 1726853709.87825: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853709.87898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853709.87926: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853709.87942: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853709.88067: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853709.90026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853709.90030: stdout chunk (state=3): >>><<< 30583 1726853709.90032: stderr chunk (state=3): >>><<< 30583 1726853709.90173: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853709.90177: _low_level_execute_command(): starting 30583 1726853709.90180: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853709.7871652-32688-177023164147686/AnsiballZ_network_connections.py && sleep 0' 30583 1726853709.91459: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853709.91481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853709.91578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853709.91603: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853709.91619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853709.91777: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853710.17687: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, a240f7a0-666a-4048-8567-0de2206b9c72 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30583 1726853710.19762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853710.19766: stdout chunk (state=3): >>><<< 30583 1726853710.19768: stderr chunk (state=3): >>><<< 30583 1726853710.19914: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, a240f7a0-666a-4048-8567-0de2206b9c72 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
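The module reply above comes back from the remote host as a single line of JSON on stdout; `changed` is false because the `statebr` connection was already active, so the run was a no-op. A minimal sketch of reading such a reply (illustrative only; the string below is abbreviated from the log, and Ansible's own result handling is more involved):

```python
import json

# Hypothetical, abbreviated reply modeled on the stdout seen above
# (not the verbatim bytes from the log).
raw_stdout = '{"changed": false, "warnings": [], "stderr": "connection statebr skipped because already active\\n"}'

result = json.loads(raw_stdout)

# An idempotent run reports no change and explains why on stderr.
assert result["changed"] is False
assert "skipped because already active" in result["stderr"]
```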
30583 1726853710.20057: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853709.7871652-32688-177023164147686/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853710.20064: _low_level_execute_command(): starting 30583 1726853710.20066: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853709.7871652-32688-177023164147686/ > /dev/null 2>&1 && sleep 0' 30583 1726853710.21136: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853710.21184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853710.21345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853710.23262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853710.23266: stderr chunk (state=3): >>><<< 30583 1726853710.23268: stdout chunk (state=3): >>><<< 30583 1726853710.23477: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853710.23480: handler run complete 30583 
1726853710.23482: attempt loop complete, returning result
30583 1726853710.23484: _execute() done
30583 1726853710.23486: dumping result to json
30583 1726853710.23487: done dumping result, returning
30583 1726853710.23489: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-05ea-abc5-000000000d26]
30583 1726853710.23491: sending task result for task 02083763-bbaf-05ea-abc5-000000000d26
30583 1726853710.23560: done sending task result for task 02083763-bbaf-05ea-abc5-000000000d26
30583 1726853710.23563: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "_invocation": {
        "module_args": {
            "__debug_flags": "",
            "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
            "connections": [
                {
                    "name": "statebr",
                    "state": "up"
                }
            ],
            "force_state_change": false,
            "ignore_errors": false,
            "provider": "nm"
        }
    },
    "changed": false
}

STDERR:

[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, a240f7a0-666a-4048-8567-0de2206b9c72 skipped because already active

30583 1726853710.23672: no more pending results, returning what we have
30583 1726853710.23680: results queue empty
30583 1726853710.23681: checking for any_errors_fatal
30583 1726853710.23686: done checking for any_errors_fatal
30583 1726853710.23686: checking for max_fail_percentage
30583 1726853710.23688: done checking for max_fail_percentage
30583 1726853710.23689: checking to see if all hosts have failed and the running result is not ok
30583 1726853710.23690: done checking to see if all hosts have failed
30583 1726853710.23690: getting the remaining hosts for this loop
30583 1726853710.23692: done getting the remaining hosts for this loop
30583 1726853710.23695: getting the next task for host managed_node2
30583 1726853710.23702: done getting next task for host managed_node2
30583 1726853710.23706: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state
30583 1726853710.23710: ^
state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
30583 1726853710.23720: getting variables
30583 1726853710.23721: in VariableManager get_vars()
30583 1726853710.23753: Calling all_inventory to load vars for managed_node2
30583 1726853710.23758: Calling groups_inventory to load vars for managed_node2
30583 1726853710.23761: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853710.23774: Calling all_plugins_play to load vars for managed_node2
30583 1726853710.23778: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853710.23785: Calling groups_plugins_play to load vars for managed_node2
30583 1726853710.25562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853710.28668: done with get_vars()
30583 1726853710.28706: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking state] **********
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
Friday 20 September 2024 13:35:10 -0400 (0:00:00.693) 0:00:45.625 ******
30583 1726853710.28806: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state
30583 1726853710.29219: worker is 1 (out of 1 available)
30583 1726853710.29231: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state
30583 1726853710.29245: done queuing things up, now waiting for results queue to drain
30583 1726853710.29246: waiting for pending results...
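The remote scratch directories used throughout this run (e.g. `/root/.ansible/tmp/ansible-tmp-1726853709.7871652-32688-177023164147686`, created and later removed by `_low_level_execute_command()` above) follow a timestamp-PID-random naming pattern, so concurrent tasks never collide. A rough sketch of that shape, inferred from the paths in this log (an assumption for illustration, not Ansible's actual implementation):

```python
import os
import random
import time

def remote_tmp_name() -> str:
    # timestamp (with fraction) - worker PID - large random suffix,
    # matching the shape of the ansible-tmp-* paths seen in this log.
    return "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2**48))

name = remote_tmp_name()
```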
30583 1726853710.29824: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853710.29978: in run() - task 02083763-bbaf-05ea-abc5-000000000d27 30583 1726853710.29982: variable 'ansible_search_path' from source: unknown 30583 1726853710.29984: variable 'ansible_search_path' from source: unknown 30583 1726853710.30067: calling self._execute() 30583 1726853710.30203: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853710.30214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853710.30230: variable 'omit' from source: magic vars 30583 1726853710.30680: variable 'ansible_distribution_major_version' from source: facts 30583 1726853710.30683: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853710.30764: variable 'network_state' from source: role '' defaults 30583 1726853710.30781: Evaluated conditional (network_state != {}): False 30583 1726853710.30791: when evaluation is False, skipping this task 30583 1726853710.30798: _execute() done 30583 1726853710.30805: dumping result to json 30583 1726853710.30812: done dumping result, returning 30583 1726853710.30829: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-05ea-abc5-000000000d27] 30583 1726853710.30838: sending task result for task 02083763-bbaf-05ea-abc5-000000000d27 30583 1726853710.31223: done sending task result for task 02083763-bbaf-05ea-abc5-000000000d27 30583 1726853710.31226: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853710.31281: no more pending results, returning what we have 30583 1726853710.31285: results queue empty 30583 1726853710.31286: checking for any_errors_fatal 30583 1726853710.31294: done checking for any_errors_fatal 
30583 1726853710.31295: checking for max_fail_percentage 30583 1726853710.31297: done checking for max_fail_percentage 30583 1726853710.31298: checking to see if all hosts have failed and the running result is not ok 30583 1726853710.31299: done checking to see if all hosts have failed 30583 1726853710.31299: getting the remaining hosts for this loop 30583 1726853710.31301: done getting the remaining hosts for this loop 30583 1726853710.31304: getting the next task for host managed_node2 30583 1726853710.31311: done getting next task for host managed_node2 30583 1726853710.31315: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30583 1726853710.31320: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853710.31345: getting variables 30583 1726853710.31347: in VariableManager get_vars() 30583 1726853710.31387: Calling all_inventory to load vars for managed_node2 30583 1726853710.31390: Calling groups_inventory to load vars for managed_node2 30583 1726853710.31392: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853710.31405: Calling all_plugins_play to load vars for managed_node2 30583 1726853710.31408: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853710.31416: Calling groups_plugins_play to load vars for managed_node2 30583 1726853710.34006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853710.37423: done with get_vars() 30583 1726853710.37447: done getting variables 30583 1726853710.37578: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:35:10 -0400 (0:00:00.088) 0:00:45.713 ****** 30583 1726853710.37616: entering _queue_task() for managed_node2/debug 30583 1726853710.38106: worker is 1 (out of 1 available) 30583 1726853710.38117: exiting _queue_task() for managed_node2/debug 30583 1726853710.38126: done queuing things up, now waiting for results queue to drain 30583 1726853710.38127: waiting for pending results... 
30583 1726853710.38339: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30583 1726853710.38512: in run() - task 02083763-bbaf-05ea-abc5-000000000d28 30583 1726853710.38515: variable 'ansible_search_path' from source: unknown 30583 1726853710.38517: variable 'ansible_search_path' from source: unknown 30583 1726853710.38546: calling self._execute() 30583 1726853710.38780: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853710.38783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853710.38786: variable 'omit' from source: magic vars 30583 1726853710.39218: variable 'ansible_distribution_major_version' from source: facts 30583 1726853710.39233: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853710.39243: variable 'omit' from source: magic vars 30583 1726853710.39314: variable 'omit' from source: magic vars 30583 1726853710.39364: variable 'omit' from source: magic vars 30583 1726853710.39433: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853710.39466: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853710.39542: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853710.39545: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853710.39547: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853710.39583: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853710.39597: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853710.39605: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 30583 1726853710.39727: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853710.39739: Set connection var ansible_timeout to 10 30583 1726853710.39747: Set connection var ansible_connection to ssh 30583 1726853710.39811: Set connection var ansible_shell_executable to /bin/sh 30583 1726853710.39814: Set connection var ansible_shell_type to sh 30583 1726853710.39817: Set connection var ansible_pipelining to False 30583 1726853710.39820: variable 'ansible_shell_executable' from source: unknown 30583 1726853710.39822: variable 'ansible_connection' from source: unknown 30583 1726853710.39828: variable 'ansible_module_compression' from source: unknown 30583 1726853710.39837: variable 'ansible_shell_type' from source: unknown 30583 1726853710.39844: variable 'ansible_shell_executable' from source: unknown 30583 1726853710.39851: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853710.39865: variable 'ansible_pipelining' from source: unknown 30583 1726853710.39878: variable 'ansible_timeout' from source: unknown 30583 1726853710.39886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853710.40087: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853710.40091: variable 'omit' from source: magic vars 30583 1726853710.40093: starting attempt loop 30583 1726853710.40096: running the handler 30583 1726853710.40228: variable '__network_connections_result' from source: set_fact 30583 1726853710.40292: handler run complete 30583 1726853710.40321: attempt loop complete, returning result 30583 1726853710.40328: _execute() done 30583 1726853710.40354: dumping result to json 30583 1726853710.40360: 
done dumping result, returning 30583 1726853710.40363: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-05ea-abc5-000000000d28] 30583 1726853710.40372: sending task result for task 02083763-bbaf-05ea-abc5-000000000d28 30583 1726853710.40541: done sending task result for task 02083763-bbaf-05ea-abc5-000000000d28 30583 1726853710.40544: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, a240f7a0-666a-4048-8567-0de2206b9c72 skipped because already active" ] } 30583 1726853710.40794: no more pending results, returning what we have 30583 1726853710.40798: results queue empty 30583 1726853710.40799: checking for any_errors_fatal 30583 1726853710.40807: done checking for any_errors_fatal 30583 1726853710.40807: checking for max_fail_percentage 30583 1726853710.40810: done checking for max_fail_percentage 30583 1726853710.40811: checking to see if all hosts have failed and the running result is not ok 30583 1726853710.40812: done checking to see if all hosts have failed 30583 1726853710.40812: getting the remaining hosts for this loop 30583 1726853710.40815: done getting the remaining hosts for this loop 30583 1726853710.40819: getting the next task for host managed_node2 30583 1726853710.40828: done getting next task for host managed_node2 30583 1726853710.40832: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30583 1726853710.40838: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853710.40849: getting variables 30583 1726853710.40851: in VariableManager get_vars() 30583 1726853710.41026: Calling all_inventory to load vars for managed_node2 30583 1726853710.41030: Calling groups_inventory to load vars for managed_node2 30583 1726853710.41037: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853710.41123: Calling all_plugins_play to load vars for managed_node2 30583 1726853710.41126: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853710.41129: Calling groups_plugins_play to load vars for managed_node2 30583 1726853710.43146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853710.44942: done with get_vars() 30583 1726853710.44974: done getting variables 30583 1726853710.45027: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:35:10 -0400 (0:00:00.074) 0:00:45.787 ****** 30583 1726853710.45073: entering _queue_task() for managed_node2/debug 30583 1726853710.45432: worker is 1 (out of 1 available) 30583 1726853710.45445: exiting _queue_task() for managed_node2/debug 30583 1726853710.45457: done queuing things up, now waiting for results queue to drain 30583 1726853710.45458: waiting for pending results... 30583 1726853710.45890: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30583 1726853710.45895: in run() - task 02083763-bbaf-05ea-abc5-000000000d29 30583 1726853710.45953: variable 'ansible_search_path' from source: unknown 30583 1726853710.46040: variable 'ansible_search_path' from source: unknown 30583 1726853710.46044: calling self._execute() 30583 1726853710.46128: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853710.46144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853710.46165: variable 'omit' from source: magic vars 30583 1726853710.47076: variable 'ansible_distribution_major_version' from source: facts 30583 1726853710.47080: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853710.47082: variable 'omit' from source: magic vars 30583 1726853710.47363: variable 'omit' from source: magic vars 30583 1726853710.47366: variable 'omit' from source: magic vars 30583 1726853710.47368: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853710.47580: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853710.47584: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853710.47586: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853710.47588: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853710.47590: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853710.47592: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853710.47691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853710.47889: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853710.47910: Set connection var ansible_timeout to 10 30583 1726853710.48019: Set connection var ansible_connection to ssh 30583 1726853710.48028: Set connection var ansible_shell_executable to /bin/sh 30583 1726853710.48035: Set connection var ansible_shell_type to sh 30583 1726853710.48047: Set connection var ansible_pipelining to False 30583 1726853710.48080: variable 'ansible_shell_executable' from source: unknown 30583 1726853710.48087: variable 'ansible_connection' from source: unknown 30583 1726853710.48092: variable 'ansible_module_compression' from source: unknown 30583 1726853710.48097: variable 'ansible_shell_type' from source: unknown 30583 1726853710.48102: variable 'ansible_shell_executable' from source: unknown 30583 1726853710.48107: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853710.48113: variable 'ansible_pipelining' from source: unknown 30583 1726853710.48122: variable 'ansible_timeout' from source: unknown 30583 1726853710.48128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853710.48485: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853710.48489: variable 'omit' from source: magic vars 30583 1726853710.48491: starting attempt loop 30583 1726853710.48493: running the handler 30583 1726853710.48530: variable '__network_connections_result' from source: set_fact 30583 1726853710.48790: variable '__network_connections_result' from source: set_fact 30583 1726853710.48976: handler run complete 30583 1726853710.49044: attempt loop complete, returning result 30583 1726853710.49052: _execute() done 30583 1726853710.49062: dumping result to json 30583 1726853710.49073: done dumping result, returning 30583 1726853710.49086: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-05ea-abc5-000000000d29] 30583 1726853710.49095: sending task result for task 02083763-bbaf-05ea-abc5-000000000d29 ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, a240f7a0-666a-4048-8567-0de2206b9c72 skipped because already active\n", "stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, a240f7a0-666a-4048-8567-0de2206b9c72 skipped because already active" ] } } 30583 1726853710.49444: no more pending results, returning what we have 30583 1726853710.49448: results queue empty 30583 1726853710.49449: checking for any_errors_fatal 30583 1726853710.49457: done checking for any_errors_fatal 30583 1726853710.49457: checking for 
max_fail_percentage 30583 1726853710.49460: done checking for max_fail_percentage 30583 1726853710.49461: checking to see if all hosts have failed and the running result is not ok 30583 1726853710.49461: done checking to see if all hosts have failed 30583 1726853710.49462: getting the remaining hosts for this loop 30583 1726853710.49464: done getting the remaining hosts for this loop 30583 1726853710.49467: getting the next task for host managed_node2 30583 1726853710.49477: done getting next task for host managed_node2 30583 1726853710.49481: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30583 1726853710.49486: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853710.49498: done sending task result for task 02083763-bbaf-05ea-abc5-000000000d29 30583 1726853710.49501: WORKER PROCESS EXITING 30583 1726853710.49533: getting variables 30583 1726853710.49535: in VariableManager get_vars() 30583 1726853710.49575: Calling all_inventory to load vars for managed_node2 30583 1726853710.49579: Calling groups_inventory to load vars for managed_node2 30583 1726853710.49587: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853710.49596: Calling all_plugins_play to load vars for managed_node2 30583 1726853710.49599: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853710.49601: Calling groups_plugins_play to load vars for managed_node2 30583 1726853710.51700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853710.54460: done with get_vars() 30583 1726853710.54493: done getting variables 30583 1726853710.54551: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:35:10 -0400 (0:00:00.097) 0:00:45.885 ****** 30583 1726853710.54794: entering _queue_task() for managed_node2/debug 30583 1726853710.55357: worker is 1 (out of 1 available) 30583 1726853710.55370: exiting _queue_task() for managed_node2/debug 30583 1726853710.55487: done queuing things up, now waiting for results queue to drain 30583 1726853710.55489: waiting for pending results... 
30583 1726853710.55874: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30583 1726853710.56289: in run() - task 02083763-bbaf-05ea-abc5-000000000d2a 30583 1726853710.56292: variable 'ansible_search_path' from source: unknown 30583 1726853710.56295: variable 'ansible_search_path' from source: unknown 30583 1726853710.56297: calling self._execute() 30583 1726853710.56462: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853710.56514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853710.56529: variable 'omit' from source: magic vars 30583 1726853710.57480: variable 'ansible_distribution_major_version' from source: facts 30583 1726853710.57483: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853710.57777: variable 'network_state' from source: role '' defaults 30583 1726853710.57962: Evaluated conditional (network_state != {}): False 30583 1726853710.57965: when evaluation is False, skipping this task 30583 1726853710.57968: _execute() done 30583 1726853710.57970: dumping result to json 30583 1726853710.57974: done dumping result, returning 30583 1726853710.57993: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-05ea-abc5-000000000d2a] 30583 1726853710.57996: sending task result for task 02083763-bbaf-05ea-abc5-000000000d2a skipping: [managed_node2] => { "false_condition": "network_state != {}" } 30583 1726853710.58131: no more pending results, returning what we have 30583 1726853710.58134: results queue empty 30583 1726853710.58135: checking for any_errors_fatal 30583 1726853710.58147: done checking for any_errors_fatal 30583 1726853710.58148: checking for max_fail_percentage 30583 1726853710.58150: done checking for max_fail_percentage 30583 1726853710.58151: checking to see if all hosts have 
failed and the running result is not ok 30583 1726853710.58152: done checking to see if all hosts have failed 30583 1726853710.58153: getting the remaining hosts for this loop 30583 1726853710.58154: done getting the remaining hosts for this loop 30583 1726853710.58158: getting the next task for host managed_node2 30583 1726853710.58168: done getting next task for host managed_node2 30583 1726853710.58173: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30583 1726853710.58179: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853710.58199: getting variables 30583 1726853710.58201: in VariableManager get_vars() 30583 1726853710.58240: Calling all_inventory to load vars for managed_node2 30583 1726853710.58242: Calling groups_inventory to load vars for managed_node2 30583 1726853710.58244: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853710.58255: Calling all_plugins_play to load vars for managed_node2 30583 1726853710.58258: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853710.58261: Calling groups_plugins_play to load vars for managed_node2 30583 1726853710.58888: done sending task result for task 02083763-bbaf-05ea-abc5-000000000d2a 30583 1726853710.58892: WORKER PROCESS EXITING 30583 1726853710.60867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853710.64216: done with get_vars() 30583 1726853710.64251: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:35:10 -0400 (0:00:00.095) 0:00:45.980 ****** 30583 1726853710.64354: entering _queue_task() for managed_node2/ping 30583 1726853710.65020: worker is 1 (out of 1 available) 30583 1726853710.65034: exiting _queue_task() for managed_node2/ping 30583 1726853710.65047: done queuing things up, now waiting for results queue to drain 30583 1726853710.65048: waiting for pending results... 
30583 1726853710.65674: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 30583 1726853710.66136: in run() - task 02083763-bbaf-05ea-abc5-000000000d2b 30583 1726853710.66140: variable 'ansible_search_path' from source: unknown 30583 1726853710.66142: variable 'ansible_search_path' from source: unknown 30583 1726853710.66145: calling self._execute() 30583 1726853710.66353: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853710.66360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853710.66363: variable 'omit' from source: magic vars 30583 1726853710.67038: variable 'ansible_distribution_major_version' from source: facts 30583 1726853710.67089: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853710.67101: variable 'omit' from source: magic vars 30583 1726853710.67186: variable 'omit' from source: magic vars 30583 1726853710.67404: variable 'omit' from source: magic vars 30583 1726853710.67407: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853710.67445: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853710.67466: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853710.67570: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853710.67577: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853710.67657: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853710.67727: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853710.67732: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30583 1726853710.67845: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853710.68058: Set connection var ansible_timeout to 10 30583 1726853710.68062: Set connection var ansible_connection to ssh 30583 1726853710.68064: Set connection var ansible_shell_executable to /bin/sh 30583 1726853710.68066: Set connection var ansible_shell_type to sh 30583 1726853710.68069: Set connection var ansible_pipelining to False 30583 1726853710.68072: variable 'ansible_shell_executable' from source: unknown 30583 1726853710.68075: variable 'ansible_connection' from source: unknown 30583 1726853710.68078: variable 'ansible_module_compression' from source: unknown 30583 1726853710.68080: variable 'ansible_shell_type' from source: unknown 30583 1726853710.68083: variable 'ansible_shell_executable' from source: unknown 30583 1726853710.68086: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853710.68089: variable 'ansible_pipelining' from source: unknown 30583 1726853710.68092: variable 'ansible_timeout' from source: unknown 30583 1726853710.68094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853710.68468: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853710.68648: variable 'omit' from source: magic vars 30583 1726853710.68651: starting attempt loop 30583 1726853710.68654: running the handler 30583 1726853710.68659: _low_level_execute_command(): starting 30583 1726853710.68661: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853710.70169: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853710.70388: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853710.70593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853710.72246: stdout chunk (state=3): >>>/root <<< 30583 1726853710.72330: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853710.72377: stderr chunk (state=3): >>><<< 30583 1726853710.72380: stdout chunk (state=3): >>><<< 30583 1726853710.72402: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853710.72420: _low_level_execute_command(): starting 30583 1726853710.72462: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853710.7240906-32744-114137691243775 `" && echo ansible-tmp-1726853710.7240906-32744-114137691243775="` echo /root/.ansible/tmp/ansible-tmp-1726853710.7240906-32744-114137691243775 `" ) && sleep 0' 30583 1726853710.73262: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853710.73386: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853710.73406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853710.73424: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853710.73524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853710.75526: stdout chunk (state=3): >>>ansible-tmp-1726853710.7240906-32744-114137691243775=/root/.ansible/tmp/ansible-tmp-1726853710.7240906-32744-114137691243775 <<< 30583 1726853710.75688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853710.75701: stdout chunk (state=3): >>><<< 30583 1726853710.75717: stderr chunk (state=3): >>><<< 30583 1726853710.75748: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853710.7240906-32744-114137691243775=/root/.ansible/tmp/ansible-tmp-1726853710.7240906-32744-114137691243775 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853710.76099: variable 'ansible_module_compression' from source: unknown 30583 1726853710.76102: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30583 1726853710.76177: variable 'ansible_facts' from source: unknown 30583 1726853710.76221: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853710.7240906-32744-114137691243775/AnsiballZ_ping.py 30583 1726853710.76393: Sending initial data 30583 1726853710.76408: Sent initial data (153 bytes) 30583 1726853710.77019: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853710.77067: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853710.77086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853710.77103: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853710.77177: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853710.77208: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853710.77225: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853710.77245: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853710.77396: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853710.79029: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853710.79095: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853710.79187: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp1tuqlelc /root/.ansible/tmp/ansible-tmp-1726853710.7240906-32744-114137691243775/AnsiballZ_ping.py <<< 30583 1726853710.79191: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853710.7240906-32744-114137691243775/AnsiballZ_ping.py" <<< 30583 1726853710.79367: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp1tuqlelc" to remote "/root/.ansible/tmp/ansible-tmp-1726853710.7240906-32744-114137691243775/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853710.7240906-32744-114137691243775/AnsiballZ_ping.py" <<< 30583 1726853710.80751: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853710.80757: stderr chunk (state=3): >>><<< 30583 1726853710.80760: stdout chunk (state=3): >>><<< 30583 1726853710.80862: done transferring module to remote 30583 1726853710.80865: _low_level_execute_command(): starting 30583 1726853710.80868: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853710.7240906-32744-114137691243775/ /root/.ansible/tmp/ansible-tmp-1726853710.7240906-32744-114137691243775/AnsiballZ_ping.py && sleep 0' 30583 1726853710.81532: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853710.81548: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853710.81565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853710.81594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853710.81622: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 
1726853710.81638: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853710.81653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853710.81683: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853710.81738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853710.81789: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853710.81817: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853710.81849: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853710.81926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853710.83993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853710.84001: stdout chunk (state=3): >>><<< 30583 1726853710.84004: stderr chunk (state=3): >>><<< 30583 1726853710.84078: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853710.84086: _low_level_execute_command(): starting 30583 1726853710.84092: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853710.7240906-32744-114137691243775/AnsiballZ_ping.py && sleep 0' 30583 1726853710.85577: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853710.85581: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853710.85584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853710.85640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853710.85818: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853710.85943: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853710.86043: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853710.86119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853710.86345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853711.01943: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30583 1726853711.03592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853711.03596: stdout chunk (state=3): >>><<< 30583 1726853711.03605: stderr chunk (state=3): >>><<< 30583 1726853711.03624: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853711.03650: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853710.7240906-32744-114137691243775/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853711.03663: _low_level_execute_command(): starting 30583 1726853711.03668: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853710.7240906-32744-114137691243775/ > /dev/null 2>&1 && sleep 0' 30583 1726853711.04668: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853711.04673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853711.04692: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address 
debug1: re-parsing configuration <<< 30583 1726853711.04697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853711.04818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853711.04841: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853711.04947: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853711.07080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853711.07083: stdout chunk (state=3): >>><<< 30583 1726853711.07085: stderr chunk (state=3): >>><<< 30583 1726853711.07109: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853711.07120: handler run complete 30583 1726853711.07127: attempt loop complete, returning result 30583 1726853711.07130: _execute() done 30583 1726853711.07133: dumping result to json 30583 1726853711.07135: done dumping result, returning 30583 1726853711.07147: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-05ea-abc5-000000000d2b] 30583 1726853711.07150: sending task result for task 02083763-bbaf-05ea-abc5-000000000d2b 30583 1726853711.07477: done sending task result for task 02083763-bbaf-05ea-abc5-000000000d2b 30583 1726853711.07481: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 30583 1726853711.07539: no more pending results, returning what we have 30583 1726853711.07543: results queue empty 30583 1726853711.07544: checking for any_errors_fatal 30583 1726853711.07549: done checking for any_errors_fatal 30583 1726853711.07549: checking for max_fail_percentage 30583 1726853711.07551: done checking for max_fail_percentage 30583 1726853711.07552: checking to see if all hosts have failed and the running result is not ok 30583 1726853711.07552: done checking to see if all hosts have failed 30583 1726853711.07553: getting the remaining hosts for this loop 30583 1726853711.07554: done getting the remaining hosts for this loop 30583 1726853711.07557: getting the next task for host managed_node2 30583 1726853711.07566: done getting next task for host managed_node2 30583 1726853711.07568: ^ task is: TASK: meta (role_complete) 30583 1726853711.07574: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853711.07584: getting variables 30583 1726853711.07586: in VariableManager get_vars() 30583 1726853711.07621: Calling all_inventory to load vars for managed_node2 30583 1726853711.07623: Calling groups_inventory to load vars for managed_node2 30583 1726853711.07625: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853711.07633: Calling all_plugins_play to load vars for managed_node2 30583 1726853711.07635: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853711.07637: Calling groups_plugins_play to load vars for managed_node2 30583 1726853711.16685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853711.18312: done with get_vars() 30583 1726853711.18350: done getting variables 30583 1726853711.18417: done queuing things up, now waiting for results queue to drain 30583 1726853711.18419: results queue empty 30583 1726853711.18420: checking for any_errors_fatal 30583 1726853711.18423: done checking for any_errors_fatal 30583 1726853711.18424: checking for max_fail_percentage 30583 1726853711.18425: done checking for max_fail_percentage 30583 1726853711.18425: checking to see if all hosts have failed and the running result is not ok 30583 1726853711.18426: done checking to see if all hosts have failed 30583 1726853711.18427: getting the remaining hosts for this loop 30583 1726853711.18428: done getting the remaining hosts for this loop 30583 1726853711.18430: getting the next task for host managed_node2 30583 1726853711.18435: done getting next task for host managed_node2 30583 1726853711.18437: ^ task is: TASK: Asserts 30583 1726853711.18439: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853711.18442: getting variables 30583 1726853711.18443: in VariableManager get_vars() 30583 1726853711.18457: Calling all_inventory to load vars for managed_node2 30583 1726853711.18459: Calling groups_inventory to load vars for managed_node2 30583 1726853711.18462: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853711.18466: Calling all_plugins_play to load vars for managed_node2 30583 1726853711.18468: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853711.18472: Calling groups_plugins_play to load vars for managed_node2 30583 1726853711.19980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853711.21889: done with get_vars() 30583 1726853711.21925: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 13:35:11 -0400 (0:00:00.576) 0:00:46.557 ****** 30583 1726853711.22008: entering _queue_task() for managed_node2/include_tasks 30583 1726853711.22488: worker is 1 (out of 1 available) 30583 1726853711.22499: exiting _queue_task() for managed_node2/include_tasks 30583 1726853711.22510: done queuing things up, now waiting for results queue to drain 30583 1726853711.22512: waiting for pending results... 
30583 1726853711.22752: running TaskExecutor() for managed_node2/TASK: Asserts 30583 1726853711.22911: in run() - task 02083763-bbaf-05ea-abc5-000000000a4e 30583 1726853711.22956: variable 'ansible_search_path' from source: unknown 30583 1726853711.22960: variable 'ansible_search_path' from source: unknown 30583 1726853711.23000: variable 'lsr_assert' from source: include params 30583 1726853711.23245: variable 'lsr_assert' from source: include params 30583 1726853711.23331: variable 'omit' from source: magic vars 30583 1726853711.23494: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853711.23498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853711.23549: variable 'omit' from source: magic vars 30583 1726853711.23782: variable 'ansible_distribution_major_version' from source: facts 30583 1726853711.23798: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853711.23809: variable 'item' from source: unknown 30583 1726853711.24094: variable 'item' from source: unknown 30583 1726853711.24098: variable 'item' from source: unknown 30583 1726853711.24202: variable 'item' from source: unknown 30583 1726853711.24546: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853711.24549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853711.24552: variable 'omit' from source: magic vars 30583 1726853711.25078: variable 'ansible_distribution_major_version' from source: facts 30583 1726853711.25082: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853711.25084: variable 'item' from source: unknown 30583 1726853711.25086: variable 'item' from source: unknown 30583 1726853711.25088: variable 'item' from source: unknown 30583 1726853711.25210: variable 'item' from source: unknown 30583 1726853711.25341: dumping result to json 30583 1726853711.25349: done dumping result, returning 30583 
1726853711.25359: done running TaskExecutor() for managed_node2/TASK: Asserts [02083763-bbaf-05ea-abc5-000000000a4e] 30583 1726853711.25367: sending task result for task 02083763-bbaf-05ea-abc5-000000000a4e 30583 1726853711.25449: done sending task result for task 02083763-bbaf-05ea-abc5-000000000a4e 30583 1726853711.25452: WORKER PROCESS EXITING 30583 1726853711.25535: no more pending results, returning what we have 30583 1726853711.25540: in VariableManager get_vars() 30583 1726853711.25584: Calling all_inventory to load vars for managed_node2 30583 1726853711.25587: Calling groups_inventory to load vars for managed_node2 30583 1726853711.25591: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853711.25606: Calling all_plugins_play to load vars for managed_node2 30583 1726853711.25610: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853711.25648: Calling groups_plugins_play to load vars for managed_node2 30583 1726853711.27485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853711.29369: done with get_vars() 30583 1726853711.29436: variable 'ansible_search_path' from source: unknown 30583 1726853711.29438: variable 'ansible_search_path' from source: unknown 30583 1726853711.29488: variable 'ansible_search_path' from source: unknown 30583 1726853711.29490: variable 'ansible_search_path' from source: unknown 30583 1726853711.29537: we have included files to process 30583 1726853711.29539: generating all_blocks data 30583 1726853711.29541: done generating all_blocks data 30583 1726853711.29547: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30583 1726853711.29548: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30583 1726853711.29550: Loading data from 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30583 1726853711.29674: in VariableManager get_vars() 30583 1726853711.29696: done with get_vars() 30583 1726853711.29815: done processing included file 30583 1726853711.29817: iterating over new_blocks loaded from include file 30583 1726853711.29818: in VariableManager get_vars() 30583 1726853711.29833: done with get_vars() 30583 1726853711.29835: filtering new block on tags 30583 1726853711.29868: done filtering new block on tags 30583 1726853711.29873: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node2 => (item=tasks/assert_device_present.yml) 30583 1726853711.29879: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30583 1726853711.29883: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30583 1726853711.29886: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30583 1726853711.29993: in VariableManager get_vars() 30583 1726853711.30011: done with get_vars() 30583 1726853711.30233: done processing included file 30583 1726853711.30234: iterating over new_blocks loaded from include file 30583 1726853711.30235: in VariableManager get_vars() 30583 1726853711.30247: done with get_vars() 30583 1726853711.30249: filtering new block on tags 30583 1726853711.30298: done filtering new block on tags 30583 1726853711.30300: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for 
managed_node2 => (item=tasks/assert_profile_present.yml) 30583 1726853711.30304: extending task lists for all hosts with included blocks 30583 1726853711.31385: done extending task lists 30583 1726853711.31387: done processing included files 30583 1726853711.31387: results queue empty 30583 1726853711.31388: checking for any_errors_fatal 30583 1726853711.31390: done checking for any_errors_fatal 30583 1726853711.31391: checking for max_fail_percentage 30583 1726853711.31392: done checking for max_fail_percentage 30583 1726853711.31393: checking to see if all hosts have failed and the running result is not ok 30583 1726853711.31394: done checking to see if all hosts have failed 30583 1726853711.31395: getting the remaining hosts for this loop 30583 1726853711.31398: done getting the remaining hosts for this loop 30583 1726853711.31403: getting the next task for host managed_node2 30583 1726853711.31408: done getting next task for host managed_node2 30583 1726853711.31410: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30583 1726853711.31413: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
30583 1726853711.31422: getting variables
30583 1726853711.31423: in VariableManager get_vars()
30583 1726853711.31436: Calling all_inventory to load vars for managed_node2
30583 1726853711.31438: Calling groups_inventory to load vars for managed_node2
30583 1726853711.31441: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853711.31446: Calling all_plugins_play to load vars for managed_node2
30583 1726853711.31449: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853711.31451: Calling groups_plugins_play to load vars for managed_node2
30583 1726853711.32602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853711.34891: done with get_vars()
30583 1726853711.34917: done getting variables

TASK [Include the task 'get_interface_stat.yml'] *******************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3
Friday 20 September 2024  13:35:11 -0400 (0:00:00.129)       0:00:46.687 ******
30583 1726853711.35003: entering _queue_task() for managed_node2/include_tasks
30583 1726853711.35388: worker is 1 (out of 1 available)
30583 1726853711.35400: exiting _queue_task() for managed_node2/include_tasks
30583 1726853711.35412: done queuing things up, now waiting for results queue to drain
30583 1726853711.35414: waiting for pending results...
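The task queued at this point lives at line 3 of assert_device_present.yml. The file itself is not shown in this log; a minimal sketch of what an include like this typically looks like, reconstructed from the log output rather than the real file contents:

```yaml
# Hypothetical sketch; the real tasks/assert_device_present.yml may differ.
- name: Include the task 'get_interface_stat.yml'
  include_tasks: tasks/get_interface_stat.yml
```

The conditional `(ansible_distribution_major_version != '6')` that the log evaluates before running the include appears to be inherited from the surrounding play rather than written on this task.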
30583 1726853711.35870: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 30583 1726853711.35877: in run() - task 02083763-bbaf-05ea-abc5-000000000e86 30583 1726853711.35881: variable 'ansible_search_path' from source: unknown 30583 1726853711.35884: variable 'ansible_search_path' from source: unknown 30583 1726853711.35923: calling self._execute() 30583 1726853711.36076: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853711.36080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853711.36083: variable 'omit' from source: magic vars 30583 1726853711.36422: variable 'ansible_distribution_major_version' from source: facts 30583 1726853711.36443: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853711.36453: _execute() done 30583 1726853711.36460: dumping result to json 30583 1726853711.36466: done dumping result, returning 30583 1726853711.36479: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-05ea-abc5-000000000e86] 30583 1726853711.36489: sending task result for task 02083763-bbaf-05ea-abc5-000000000e86 30583 1726853711.36627: done sending task result for task 02083763-bbaf-05ea-abc5-000000000e86 30583 1726853711.36630: WORKER PROCESS EXITING 30583 1726853711.36665: no more pending results, returning what we have 30583 1726853711.36672: in VariableManager get_vars() 30583 1726853711.36715: Calling all_inventory to load vars for managed_node2 30583 1726853711.36718: Calling groups_inventory to load vars for managed_node2 30583 1726853711.36722: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853711.36739: Calling all_plugins_play to load vars for managed_node2 30583 1726853711.36744: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853711.36753: Calling groups_plugins_play to load vars for managed_node2 30583 
1726853711.38528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853711.40162: done with get_vars() 30583 1726853711.40189: variable 'ansible_search_path' from source: unknown 30583 1726853711.40190: variable 'ansible_search_path' from source: unknown 30583 1726853711.40201: variable 'item' from source: include params 30583 1726853711.40339: variable 'item' from source: include params 30583 1726853711.40385: we have included files to process 30583 1726853711.40387: generating all_blocks data 30583 1726853711.40388: done generating all_blocks data 30583 1726853711.40390: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30583 1726853711.40391: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30583 1726853711.40393: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30583 1726853711.40609: done processing included file 30583 1726853711.40611: iterating over new_blocks loaded from include file 30583 1726853711.40613: in VariableManager get_vars() 30583 1726853711.40636: done with get_vars() 30583 1726853711.40638: filtering new block on tags 30583 1726853711.40664: done filtering new block on tags 30583 1726853711.40667: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 30583 1726853711.40677: extending task lists for all hosts with included blocks 30583 1726853711.40847: done extending task lists 30583 1726853711.40850: done processing included files 30583 1726853711.40851: results queue empty 30583 1726853711.40852: checking for any_errors_fatal 30583 1726853711.40856: done 
checking for any_errors_fatal 30583 1726853711.40856: checking for max_fail_percentage 30583 1726853711.40858: done checking for max_fail_percentage 30583 1726853711.40858: checking to see if all hosts have failed and the running result is not ok 30583 1726853711.40859: done checking to see if all hosts have failed 30583 1726853711.40860: getting the remaining hosts for this loop 30583 1726853711.40861: done getting the remaining hosts for this loop 30583 1726853711.40864: getting the next task for host managed_node2 30583 1726853711.40868: done getting next task for host managed_node2 30583 1726853711.40872: ^ task is: TASK: Get stat for interface {{ interface }} 30583 1726853711.40875: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
30583 1726853711.40878: getting variables
30583 1726853711.40883: in VariableManager get_vars()
30583 1726853711.40897: Calling all_inventory to load vars for managed_node2
30583 1726853711.40900: Calling groups_inventory to load vars for managed_node2
30583 1726853711.40902: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853711.40908: Calling all_plugins_play to load vars for managed_node2
30583 1726853711.40910: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853711.40913: Calling groups_plugins_play to load vars for managed_node2
30583 1726853711.42580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853711.44204: done with get_vars()
30583 1726853711.44229: done getting variables
30583 1726853711.44380: variable 'interface' from source: play vars

TASK [Get stat for interface statebr] ******************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3
Friday 20 September 2024  13:35:11 -0400 (0:00:00.094)       0:00:46.781 ******
30583 1726853711.44413: entering _queue_task() for managed_node2/stat
30583 1726853711.45339: worker is 1 (out of 1 available)
30583 1726853711.45376: exiting _queue_task() for managed_node2/stat
30583 1726853711.45389: done queuing things up, now waiting for results queue to drain
30583 1726853711.45391: waiting for pending results...
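The stat task queued here is at line 3 of get_interface_stat.yml. Based on the module arguments visible later in this log (path `/sys/class/net/statebr`, with `get_attributes`, `get_checksum`, and `get_mime` all false), the task presumably looks roughly like this; a hedged reconstruction, and the `register` name is an assumption:

```yaml
# Hypothetical sketch reconstructed from the stat module_args shown in
# the log; the real tasks/get_interface_stat.yml may differ.
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat  # assumed register name
```

On Linux, `/sys/class/net/<name>` exists (as a symlink into `/sys/devices/...`, which is exactly what the stat result below reports for `statebr`) precisely when the kernel has a network device by that name, so a follow-up assertion only needs to check `stat.exists`.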
30583 1726853711.46152: running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr 30583 1726853711.46159: in run() - task 02083763-bbaf-05ea-abc5-000000000ef5 30583 1726853711.46163: variable 'ansible_search_path' from source: unknown 30583 1726853711.46166: variable 'ansible_search_path' from source: unknown 30583 1726853711.46180: calling self._execute() 30583 1726853711.46307: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853711.46386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853711.46527: variable 'omit' from source: magic vars 30583 1726853711.47278: variable 'ansible_distribution_major_version' from source: facts 30583 1726853711.47282: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853711.47285: variable 'omit' from source: magic vars 30583 1726853711.47332: variable 'omit' from source: magic vars 30583 1726853711.47502: variable 'interface' from source: play vars 30583 1726853711.47583: variable 'omit' from source: magic vars 30583 1726853711.47630: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853711.47696: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853711.47718: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853711.47741: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853711.47753: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853711.47791: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853711.47794: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853711.47797: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853711.47907: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853711.47912: Set connection var ansible_timeout to 10 30583 1726853711.47918: Set connection var ansible_connection to ssh 30583 1726853711.47920: Set connection var ansible_shell_executable to /bin/sh 30583 1726853711.47922: Set connection var ansible_shell_type to sh 30583 1726853711.47986: Set connection var ansible_pipelining to False 30583 1726853711.47989: variable 'ansible_shell_executable' from source: unknown 30583 1726853711.47992: variable 'ansible_connection' from source: unknown 30583 1726853711.47995: variable 'ansible_module_compression' from source: unknown 30583 1726853711.47997: variable 'ansible_shell_type' from source: unknown 30583 1726853711.47999: variable 'ansible_shell_executable' from source: unknown 30583 1726853711.48001: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853711.48003: variable 'ansible_pipelining' from source: unknown 30583 1726853711.48005: variable 'ansible_timeout' from source: unknown 30583 1726853711.48007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853711.48203: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853711.48214: variable 'omit' from source: magic vars 30583 1726853711.48248: starting attempt loop 30583 1726853711.48256: running the handler 30583 1726853711.48311: _low_level_execute_command(): starting 30583 1726853711.48315: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853711.49323: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853711.49449: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853711.49522: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853711.49599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853711.51362: stdout chunk (state=3): >>>/root <<< 30583 1726853711.51591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853711.51602: stdout chunk (state=3): >>><<< 30583 1726853711.51628: stderr chunk (state=3): >>><<< 30583 1726853711.51751: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853711.51842: _low_level_execute_command(): starting 30583 1726853711.51847: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853711.5171633-32782-120238241946090 `" && echo ansible-tmp-1726853711.5171633-32782-120238241946090="` echo /root/.ansible/tmp/ansible-tmp-1726853711.5171633-32782-120238241946090 `" ) && sleep 0' 30583 1726853711.53232: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853711.53322: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853711.53342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853711.53367: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853711.53489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853711.55550: stdout chunk (state=3): >>>ansible-tmp-1726853711.5171633-32782-120238241946090=/root/.ansible/tmp/ansible-tmp-1726853711.5171633-32782-120238241946090 <<< 30583 1726853711.55673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853711.55693: stdout chunk (state=3): >>><<< 30583 1726853711.55702: stderr chunk (state=3): >>><<< 30583 1726853711.55746: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853711.5171633-32782-120238241946090=/root/.ansible/tmp/ansible-tmp-1726853711.5171633-32782-120238241946090 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853711.55884: variable 'ansible_module_compression' from source: unknown 30583 1726853711.55937: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30583 1726853711.56122: variable 'ansible_facts' from source: unknown 30583 1726853711.56234: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853711.5171633-32782-120238241946090/AnsiballZ_stat.py 30583 1726853711.56479: Sending initial data 30583 1726853711.56488: Sent initial data (153 bytes) 30583 1726853711.57376: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853711.57380: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853711.57384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' <<< 30583 1726853711.57386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853711.57445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853711.57547: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853711.59299: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853711.59358: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpsd0qbvzn /root/.ansible/tmp/ansible-tmp-1726853711.5171633-32782-120238241946090/AnsiballZ_stat.py <<< 30583 1726853711.59362: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853711.5171633-32782-120238241946090/AnsiballZ_stat.py" <<< 30583 1726853711.59432: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpsd0qbvzn" to remote "/root/.ansible/tmp/ansible-tmp-1726853711.5171633-32782-120238241946090/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853711.5171633-32782-120238241946090/AnsiballZ_stat.py" <<< 30583 1726853711.60828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853711.60887: stderr chunk (state=3): >>><<< 30583 1726853711.60890: stdout chunk (state=3): >>><<< 30583 1726853711.60914: done transferring module to remote 30583 1726853711.60925: _low_level_execute_command(): starting 30583 1726853711.60935: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853711.5171633-32782-120238241946090/ /root/.ansible/tmp/ansible-tmp-1726853711.5171633-32782-120238241946090/AnsiballZ_stat.py && sleep 0' 30583 1726853711.62247: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853711.62294: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853711.62360: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853711.62367: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853711.62468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853711.64359: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853711.64676: stderr chunk (state=3): >>><<< 30583 1726853711.64680: stdout chunk (state=3): >>><<< 30583 1726853711.64683: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853711.64690: _low_level_execute_command(): starting 30583 1726853711.64692: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853711.5171633-32782-120238241946090/AnsiballZ_stat.py && sleep 0' 30583 1726853711.65201: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853711.65226: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853711.65288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853711.65338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853711.65349: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853711.65360: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853711.65476: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853711.81276: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 32026, "dev": 23, "nlink": 1, "atime": 1726853703.4845934, "mtime": 1726853703.4845934, "ctime": 1726853703.4845934, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30583 1726853711.82779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853711.82799: stdout chunk (state=3): >>><<< 30583 1726853711.82813: stderr chunk (state=3): >>><<< 30583 1726853711.82837: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 32026, "dev": 23, "nlink": 1, "atime": 1726853703.4845934, "mtime": 1726853703.4845934, "ctime": 1726853703.4845934, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853711.82979: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853711.5171633-32782-120238241946090/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853711.82982: _low_level_execute_command(): starting 30583 1726853711.82985: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853711.5171633-32782-120238241946090/ > /dev/null 2>&1 && sleep 0' 30583 1726853711.83597: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853711.83610: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853711.83626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853711.83652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853711.83763: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853711.83787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853711.83892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853711.85866: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853711.85880: stdout chunk (state=3): >>><<< 30583 1726853711.85893: stderr chunk (state=3): >>><<< 30583 1726853711.85914: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853711.86078: handler run complete 30583 1726853711.86082: attempt loop complete, returning result 30583 1726853711.86084: _execute() done 30583 1726853711.86086: dumping result to json 30583 1726853711.86088: done dumping result, returning 30583 1726853711.86090: done running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr [02083763-bbaf-05ea-abc5-000000000ef5] 30583 1726853711.86092: sending task result for task 02083763-bbaf-05ea-abc5-000000000ef5 30583 1726853711.86181: done sending task result for task 02083763-bbaf-05ea-abc5-000000000ef5 30583 1726853711.86184: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726853703.4845934, "block_size": 4096, "blocks": 0, "ctime": 1726853703.4845934, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 32026, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "mode": "0777", "mtime": 1726853703.4845934, "nlink": 1, "path": "/sys/class/net/statebr", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 30583 1726853711.86277: no more pending results, returning what we have 30583 1726853711.86281: results queue empty 30583 1726853711.86282: checking for 
any_errors_fatal 30583 1726853711.86283: done checking for any_errors_fatal 30583 1726853711.86284: checking for max_fail_percentage 30583 1726853711.86286: done checking for max_fail_percentage 30583 1726853711.86287: checking to see if all hosts have failed and the running result is not ok 30583 1726853711.86287: done checking to see if all hosts have failed 30583 1726853711.86288: getting the remaining hosts for this loop 30583 1726853711.86290: done getting the remaining hosts for this loop 30583 1726853711.86293: getting the next task for host managed_node2 30583 1726853711.86417: done getting next task for host managed_node2 30583 1726853711.86420: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 30583 1726853711.86422: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853711.86427: getting variables 30583 1726853711.86428: in VariableManager get_vars() 30583 1726853711.86457: Calling all_inventory to load vars for managed_node2 30583 1726853711.86459: Calling groups_inventory to load vars for managed_node2 30583 1726853711.86462: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853711.86533: Calling all_plugins_play to load vars for managed_node2 30583 1726853711.86539: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853711.86543: Calling groups_plugins_play to load vars for managed_node2 30583 1726853711.88161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853711.91782: done with get_vars() 30583 1726853711.91890: done getting variables 30583 1726853711.91952: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853711.92124: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'statebr'] ************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:35:11 -0400 (0:00:00.477) 0:00:47.258 ****** 30583 1726853711.92159: entering _queue_task() for managed_node2/assert 30583 1726853711.92856: worker is 1 (out of 1 available) 30583 1726853711.92868: exiting _queue_task() for managed_node2/assert 30583 1726853711.92881: done queuing things up, now waiting for results queue to drain 30583 1726853711.92883: waiting for pending results... 
30583 1726853711.93073: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'statebr' 30583 1726853711.93196: in run() - task 02083763-bbaf-05ea-abc5-000000000e87 30583 1726853711.93215: variable 'ansible_search_path' from source: unknown 30583 1726853711.93222: variable 'ansible_search_path' from source: unknown 30583 1726853711.93261: calling self._execute() 30583 1726853711.93353: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853711.93363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853711.93378: variable 'omit' from source: magic vars 30583 1726853711.93740: variable 'ansible_distribution_major_version' from source: facts 30583 1726853711.93758: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853711.93770: variable 'omit' from source: magic vars 30583 1726853711.93829: variable 'omit' from source: magic vars 30583 1726853711.93932: variable 'interface' from source: play vars 30583 1726853711.93956: variable 'omit' from source: magic vars 30583 1726853711.94002: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853711.94046: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853711.94077: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853711.94101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853711.94119: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853711.94157: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853711.94167: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853711.94178: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853711.94353: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853711.94356: Set connection var ansible_timeout to 10 30583 1726853711.94359: Set connection var ansible_connection to ssh 30583 1726853711.94361: Set connection var ansible_shell_executable to /bin/sh 30583 1726853711.94363: Set connection var ansible_shell_type to sh 30583 1726853711.94365: Set connection var ansible_pipelining to False 30583 1726853711.94392: variable 'ansible_shell_executable' from source: unknown 30583 1726853711.94401: variable 'ansible_connection' from source: unknown 30583 1726853711.94463: variable 'ansible_module_compression' from source: unknown 30583 1726853711.94466: variable 'ansible_shell_type' from source: unknown 30583 1726853711.94469: variable 'ansible_shell_executable' from source: unknown 30583 1726853711.94473: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853711.94475: variable 'ansible_pipelining' from source: unknown 30583 1726853711.94477: variable 'ansible_timeout' from source: unknown 30583 1726853711.94479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853711.94593: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853711.94610: variable 'omit' from source: magic vars 30583 1726853711.94620: starting attempt loop 30583 1726853711.94627: running the handler 30583 1726853711.94763: variable 'interface_stat' from source: set_fact 30583 1726853711.94792: Evaluated conditional (interface_stat.stat.exists): True 30583 1726853711.94895: handler run complete 30583 1726853711.94898: attempt loop complete, returning result 30583 
1726853711.94900: _execute() done 30583 1726853711.94902: dumping result to json 30583 1726853711.94905: done dumping result, returning 30583 1726853711.94907: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'statebr' [02083763-bbaf-05ea-abc5-000000000e87] 30583 1726853711.94910: sending task result for task 02083763-bbaf-05ea-abc5-000000000e87 30583 1726853711.94979: done sending task result for task 02083763-bbaf-05ea-abc5-000000000e87 30583 1726853711.94982: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30583 1726853711.95050: no more pending results, returning what we have 30583 1726853711.95054: results queue empty 30583 1726853711.95055: checking for any_errors_fatal 30583 1726853711.95065: done checking for any_errors_fatal 30583 1726853711.95066: checking for max_fail_percentage 30583 1726853711.95068: done checking for max_fail_percentage 30583 1726853711.95068: checking to see if all hosts have failed and the running result is not ok 30583 1726853711.95069: done checking to see if all hosts have failed 30583 1726853711.95070: getting the remaining hosts for this loop 30583 1726853711.95073: done getting the remaining hosts for this loop 30583 1726853711.95077: getting the next task for host managed_node2 30583 1726853711.95088: done getting next task for host managed_node2 30583 1726853711.95090: ^ task is: TASK: Include the task 'get_profile_stat.yml' 30583 1726853711.95094: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853711.95098: getting variables 30583 1726853711.95100: in VariableManager get_vars() 30583 1726853711.95133: Calling all_inventory to load vars for managed_node2 30583 1726853711.95135: Calling groups_inventory to load vars for managed_node2 30583 1726853711.95138: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853711.95149: Calling all_plugins_play to load vars for managed_node2 30583 1726853711.95152: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853711.95155: Calling groups_plugins_play to load vars for managed_node2 30583 1726853711.98372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853712.00179: done with get_vars() 30583 1726853712.00208: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 13:35:12 -0400 (0:00:00.081) 0:00:47.340 ****** 30583 1726853712.00313: entering _queue_task() for managed_node2/include_tasks 30583 1726853712.00709: worker is 1 (out of 1 available) 30583 1726853712.00721: exiting _queue_task() for managed_node2/include_tasks 30583 1726853712.00732: done queuing things up, now waiting for results queue to drain 30583 1726853712.00733: waiting for pending results... 
30583 1726853712.01281: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 30583 1726853712.01598: in run() - task 02083763-bbaf-05ea-abc5-000000000e8b 30583 1726853712.01602: variable 'ansible_search_path' from source: unknown 30583 1726853712.01605: variable 'ansible_search_path' from source: unknown 30583 1726853712.01607: calling self._execute() 30583 1726853712.01610: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853712.01646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853712.01665: variable 'omit' from source: magic vars 30583 1726853712.02425: variable 'ansible_distribution_major_version' from source: facts 30583 1726853712.02474: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853712.02486: _execute() done 30583 1726853712.02500: dumping result to json 30583 1726853712.02514: done dumping result, returning 30583 1726853712.02557: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-05ea-abc5-000000000e8b] 30583 1726853712.02593: sending task result for task 02083763-bbaf-05ea-abc5-000000000e8b 30583 1726853712.02817: no more pending results, returning what we have 30583 1726853712.02822: in VariableManager get_vars() 30583 1726853712.02879: Calling all_inventory to load vars for managed_node2 30583 1726853712.02882: Calling groups_inventory to load vars for managed_node2 30583 1726853712.02887: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853712.02902: Calling all_plugins_play to load vars for managed_node2 30583 1726853712.02906: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853712.02909: Calling groups_plugins_play to load vars for managed_node2 30583 1726853712.03722: done sending task result for task 02083763-bbaf-05ea-abc5-000000000e8b 30583 1726853712.03726: WORKER PROCESS EXITING 30583 
1726853712.04866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853712.07551: done with get_vars() 30583 1726853712.07635: variable 'ansible_search_path' from source: unknown 30583 1726853712.07637: variable 'ansible_search_path' from source: unknown 30583 1726853712.07648: variable 'item' from source: include params 30583 1726853712.07874: variable 'item' from source: include params 30583 1726853712.07913: we have included files to process 30583 1726853712.07915: generating all_blocks data 30583 1726853712.07917: done generating all_blocks data 30583 1726853712.07921: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30583 1726853712.07922: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30583 1726853712.07925: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30583 1726853712.09016: done processing included file 30583 1726853712.09018: iterating over new_blocks loaded from include file 30583 1726853712.09020: in VariableManager get_vars() 30583 1726853712.09038: done with get_vars() 30583 1726853712.09040: filtering new block on tags 30583 1726853712.09124: done filtering new block on tags 30583 1726853712.09127: in VariableManager get_vars() 30583 1726853712.09142: done with get_vars() 30583 1726853712.09144: filtering new block on tags 30583 1726853712.09212: done filtering new block on tags 30583 1726853712.09215: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 30583 1726853712.09220: extending task lists for all hosts with included blocks 30583 1726853712.09603: done 
extending task lists 30583 1726853712.09608: done processing included files 30583 1726853712.09609: results queue empty 30583 1726853712.09610: checking for any_errors_fatal 30583 1726853712.09613: done checking for any_errors_fatal 30583 1726853712.09614: checking for max_fail_percentage 30583 1726853712.09615: done checking for max_fail_percentage 30583 1726853712.09616: checking to see if all hosts have failed and the running result is not ok 30583 1726853712.09617: done checking to see if all hosts have failed 30583 1726853712.09617: getting the remaining hosts for this loop 30583 1726853712.09619: done getting the remaining hosts for this loop 30583 1726853712.09621: getting the next task for host managed_node2 30583 1726853712.09626: done getting next task for host managed_node2 30583 1726853712.09628: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 30583 1726853712.09631: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30583 1726853712.09634: getting variables 30583 1726853712.09635: in VariableManager get_vars() 30583 1726853712.09643: Calling all_inventory to load vars for managed_node2 30583 1726853712.09646: Calling groups_inventory to load vars for managed_node2 30583 1726853712.09648: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853712.09653: Calling all_plugins_play to load vars for managed_node2 30583 1726853712.09656: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853712.09659: Calling groups_plugins_play to load vars for managed_node2 30583 1726853712.10855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853712.12640: done with get_vars() 30583 1726853712.12674: done getting variables 30583 1726853712.12717: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:35:12 -0400 (0:00:00.124) 0:00:47.464 ****** 30583 1726853712.12750: entering _queue_task() for managed_node2/set_fact 30583 1726853712.13111: worker is 1 (out of 1 available) 30583 1726853712.13123: exiting _queue_task() for managed_node2/set_fact 30583 1726853712.13136: done queuing things up, now waiting for results queue to drain 30583 1726853712.13137: waiting for pending results... 
30583 1726853712.13403: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 30583 1726853712.13578: in run() - task 02083763-bbaf-05ea-abc5-000000000f13 30583 1726853712.13608: variable 'ansible_search_path' from source: unknown 30583 1726853712.13620: variable 'ansible_search_path' from source: unknown 30583 1726853712.13669: calling self._execute() 30583 1726853712.13787: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853712.13799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853712.13818: variable 'omit' from source: magic vars 30583 1726853712.14252: variable 'ansible_distribution_major_version' from source: facts 30583 1726853712.14280: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853712.14292: variable 'omit' from source: magic vars 30583 1726853712.14361: variable 'omit' from source: magic vars 30583 1726853712.14408: variable 'omit' from source: magic vars 30583 1726853712.14454: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853712.14509: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853712.14578: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853712.14581: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853712.14583: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853712.14619: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853712.14627: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853712.14636: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30583 1726853712.14752: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853712.14767: Set connection var ansible_timeout to 10 30583 1726853712.14778: Set connection var ansible_connection to ssh 30583 1726853712.14811: Set connection var ansible_shell_executable to /bin/sh 30583 1726853712.14814: Set connection var ansible_shell_type to sh 30583 1726853712.14819: Set connection var ansible_pipelining to False 30583 1726853712.14848: variable 'ansible_shell_executable' from source: unknown 30583 1726853712.14903: variable 'ansible_connection' from source: unknown 30583 1726853712.14907: variable 'ansible_module_compression' from source: unknown 30583 1726853712.14909: variable 'ansible_shell_type' from source: unknown 30583 1726853712.14912: variable 'ansible_shell_executable' from source: unknown 30583 1726853712.14915: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853712.14918: variable 'ansible_pipelining' from source: unknown 30583 1726853712.14921: variable 'ansible_timeout' from source: unknown 30583 1726853712.14923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853712.15067: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853712.15086: variable 'omit' from source: magic vars 30583 1726853712.15120: starting attempt loop 30583 1726853712.15123: running the handler 30583 1726853712.15126: handler run complete 30583 1726853712.15147: attempt loop complete, returning result 30583 1726853712.15229: _execute() done 30583 1726853712.15232: dumping result to json 30583 1726853712.15234: done dumping result, returning 30583 1726853712.15237: done running TaskExecutor() for 
managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [02083763-bbaf-05ea-abc5-000000000f13] 30583 1726853712.15239: sending task result for task 02083763-bbaf-05ea-abc5-000000000f13 30583 1726853712.15318: done sending task result for task 02083763-bbaf-05ea-abc5-000000000f13 30583 1726853712.15322: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 30583 1726853712.15396: no more pending results, returning what we have 30583 1726853712.15400: results queue empty 30583 1726853712.15402: checking for any_errors_fatal 30583 1726853712.15404: done checking for any_errors_fatal 30583 1726853712.15405: checking for max_fail_percentage 30583 1726853712.15407: done checking for max_fail_percentage 30583 1726853712.15408: checking to see if all hosts have failed and the running result is not ok 30583 1726853712.15409: done checking to see if all hosts have failed 30583 1726853712.15409: getting the remaining hosts for this loop 30583 1726853712.15411: done getting the remaining hosts for this loop 30583 1726853712.15415: getting the next task for host managed_node2 30583 1726853712.15425: done getting next task for host managed_node2 30583 1726853712.15428: ^ task is: TASK: Stat profile file 30583 1726853712.15436: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853712.15442: getting variables 30583 1726853712.15444: in VariableManager get_vars() 30583 1726853712.15484: Calling all_inventory to load vars for managed_node2 30583 1726853712.15488: Calling groups_inventory to load vars for managed_node2 30583 1726853712.15491: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853712.15504: Calling all_plugins_play to load vars for managed_node2 30583 1726853712.15508: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853712.15511: Calling groups_plugins_play to load vars for managed_node2 30583 1726853712.17314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853712.19215: done with get_vars() 30583 1726853712.19252: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:35:12 -0400 (0:00:00.067) 0:00:47.532 ****** 30583 1726853712.19490: entering _queue_task() for managed_node2/stat 30583 1726853712.20141: worker is 1 (out of 1 available) 30583 1726853712.20167: exiting _queue_task() for managed_node2/stat 30583 1726853712.20189: done queuing things up, now waiting for results queue to drain 30583 1726853712.20194: 
waiting for pending results... 30583 1726853712.20897: running TaskExecutor() for managed_node2/TASK: Stat profile file 30583 1726853712.21302: in run() - task 02083763-bbaf-05ea-abc5-000000000f14 30583 1726853712.21306: variable 'ansible_search_path' from source: unknown 30583 1726853712.21309: variable 'ansible_search_path' from source: unknown 30583 1726853712.21312: calling self._execute() 30583 1726853712.21419: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853712.21433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853712.21450: variable 'omit' from source: magic vars 30583 1726853712.21898: variable 'ansible_distribution_major_version' from source: facts 30583 1726853712.21957: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853712.21969: variable 'omit' from source: magic vars 30583 1726853712.22031: variable 'omit' from source: magic vars 30583 1726853712.22154: variable 'profile' from source: play vars 30583 1726853712.22157: variable 'interface' from source: play vars 30583 1726853712.22209: variable 'interface' from source: play vars 30583 1726853712.22234: variable 'omit' from source: magic vars 30583 1726853712.22372: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853712.22376: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853712.22378: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853712.22380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853712.22382: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853712.22405: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 30583 1726853712.22412: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853712.22418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853712.22516: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853712.22777: Set connection var ansible_timeout to 10 30583 1726853712.22780: Set connection var ansible_connection to ssh 30583 1726853712.22782: Set connection var ansible_shell_executable to /bin/sh 30583 1726853712.22784: Set connection var ansible_shell_type to sh 30583 1726853712.22786: Set connection var ansible_pipelining to False 30583 1726853712.22788: variable 'ansible_shell_executable' from source: unknown 30583 1726853712.22790: variable 'ansible_connection' from source: unknown 30583 1726853712.22791: variable 'ansible_module_compression' from source: unknown 30583 1726853712.22793: variable 'ansible_shell_type' from source: unknown 30583 1726853712.22795: variable 'ansible_shell_executable' from source: unknown 30583 1726853712.22797: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853712.22799: variable 'ansible_pipelining' from source: unknown 30583 1726853712.22801: variable 'ansible_timeout' from source: unknown 30583 1726853712.22803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853712.23105: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853712.23120: variable 'omit' from source: magic vars 30583 1726853712.23183: starting attempt loop 30583 1726853712.23190: running the handler 30583 1726853712.23209: _low_level_execute_command(): starting 30583 1726853712.23219: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 
1726853712.24112: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853712.24129: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853712.24145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853712.24164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853712.24194: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853712.24283: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853712.24313: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853712.24415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853712.26256: stdout chunk (state=3): >>>/root <<< 30583 1726853712.26292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853712.26553: stderr chunk (state=3): >>><<< 30583 1726853712.26556: stdout chunk (state=3): >>><<< 30583 1726853712.26560: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853712.26563: _low_level_execute_command(): starting 30583 1726853712.26566: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853712.2649002-32817-234821910112271 `" && echo ansible-tmp-1726853712.2649002-32817-234821910112271="` echo /root/.ansible/tmp/ansible-tmp-1726853712.2649002-32817-234821910112271 `" ) && sleep 0' 30583 1726853712.27646: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853712.27666: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853712.27773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853712.27807: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853712.27836: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853712.28016: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853712.29992: stdout chunk (state=3): >>>ansible-tmp-1726853712.2649002-32817-234821910112271=/root/.ansible/tmp/ansible-tmp-1726853712.2649002-32817-234821910112271 <<< 30583 1726853712.30177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853712.30188: stdout chunk (state=3): >>><<< 30583 1726853712.30191: stderr chunk (state=3): >>><<< 30583 1726853712.30404: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853712.2649002-32817-234821910112271=/root/.ansible/tmp/ansible-tmp-1726853712.2649002-32817-234821910112271 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853712.30407: variable 'ansible_module_compression' from source: unknown 30583 1726853712.30577: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30583 1726853712.30580: variable 'ansible_facts' from source: unknown 30583 1726853712.30732: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853712.2649002-32817-234821910112271/AnsiballZ_stat.py 30583 1726853712.31204: Sending initial data 30583 1726853712.31207: Sent initial data (153 bytes) 30583 1726853712.31787: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853712.31799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853712.31810: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853712.31819: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853712.32000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853712.33808: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853712.33908: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853712.34035: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpp2fgl1g3 /root/.ansible/tmp/ansible-tmp-1726853712.2649002-32817-234821910112271/AnsiballZ_stat.py <<< 30583 1726853712.34039: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853712.2649002-32817-234821910112271/AnsiballZ_stat.py" <<< 30583 1726853712.34099: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpp2fgl1g3" to remote "/root/.ansible/tmp/ansible-tmp-1726853712.2649002-32817-234821910112271/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853712.2649002-32817-234821910112271/AnsiballZ_stat.py" <<< 30583 1726853712.34939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853712.34943: stdout chunk (state=3): >>><<< 30583 1726853712.34947: stderr chunk (state=3): >>><<< 30583 1726853712.35013: done transferring module to remote 30583 1726853712.35024: _low_level_execute_command(): starting 30583 1726853712.35028: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853712.2649002-32817-234821910112271/ /root/.ansible/tmp/ansible-tmp-1726853712.2649002-32817-234821910112271/AnsiballZ_stat.py && sleep 0' 30583 1726853712.35608: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853712.35618: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853712.35628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853712.35642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853712.35661: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 
1726853712.35665: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853712.35673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853712.35760: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853712.35766: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853712.35867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853712.37909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853712.37913: stdout chunk (state=3): >>><<< 30583 1726853712.37916: stderr chunk (state=3): >>><<< 30583 1726853712.37918: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853712.37921: _low_level_execute_command(): starting 30583 1726853712.37923: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853712.2649002-32817-234821910112271/AnsiballZ_stat.py && sleep 0' 30583 1726853712.39292: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853712.39392: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 
1726853712.39507: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853712.55404: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30583 1726853712.57078: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853712.57082: stderr chunk (state=3): >>>Shared connection to 10.31.9.197 closed. <<< 30583 1726853712.57085: stderr chunk (state=3): >>><<< 30583 1726853712.57087: stdout chunk (state=3): >>><<< 30583 1726853712.57089: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853712.57091: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853712.2649002-32817-234821910112271/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853712.57094: _low_level_execute_command(): starting 30583 1726853712.57096: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853712.2649002-32817-234821910112271/ > /dev/null 2>&1 && sleep 0' 30583 1726853712.57731: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853712.57734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853712.57737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853712.57739: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853712.57742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853712.57828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853712.57899: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853712.59895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853712.59918: stderr chunk (state=3): >>><<< 30583 1726853712.59928: stdout chunk (state=3): >>><<< 30583 1726853712.59950: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853712.60076: handler run complete 30583 1726853712.60079: attempt loop complete, returning result 30583 1726853712.60081: _execute() done 30583 1726853712.60084: dumping result to json 30583 1726853712.60085: done dumping result, returning 30583 1726853712.60087: done running TaskExecutor() for managed_node2/TASK: Stat profile file [02083763-bbaf-05ea-abc5-000000000f14] 30583 1726853712.60089: sending task result for task 02083763-bbaf-05ea-abc5-000000000f14 30583 1726853712.60164: done sending task result for task 02083763-bbaf-05ea-abc5-000000000f14 30583 1726853712.60167: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 30583 1726853712.60226: no more pending results, returning what we have 30583 1726853712.60229: results queue empty 30583 1726853712.60230: checking for any_errors_fatal 30583 1726853712.60240: done checking for any_errors_fatal 30583 1726853712.60240: checking for max_fail_percentage 30583 1726853712.60242: done checking for max_fail_percentage 30583 1726853712.60243: checking to see if all hosts have failed and the running result is not ok 30583 1726853712.60244: done checking to see if all hosts have failed 30583 1726853712.60244: getting the remaining hosts for this loop 30583 1726853712.60246: done getting the remaining hosts for this loop 30583 1726853712.60250: getting the next task for host managed_node2 30583 1726853712.60259: done getting next task for host managed_node2 30583 1726853712.60261: ^ task is: TASK: Set NM profile exist flag based on the profile files 30583 1726853712.60266: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853712.60270: getting variables 30583 1726853712.60475: in VariableManager get_vars() 30583 1726853712.60506: Calling all_inventory to load vars for managed_node2 30583 1726853712.60508: Calling groups_inventory to load vars for managed_node2 30583 1726853712.60512: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853712.60521: Calling all_plugins_play to load vars for managed_node2 30583 1726853712.60523: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853712.60526: Calling groups_plugins_play to load vars for managed_node2 30583 1726853712.61875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853712.66345: done with get_vars() 30583 1726853712.66420: done getting variables 30583 1726853712.66538: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:35:12 -0400 (0:00:00.470) 0:00:48.002 ****** 30583 1726853712.66578: entering _queue_task() for managed_node2/set_fact 30583 1726853712.67307: worker is 1 (out of 1 available) 30583 1726853712.67319: exiting _queue_task() for managed_node2/set_fact 30583 1726853712.67333: done queuing things up, now waiting for results queue to drain 30583 1726853712.67334: waiting for pending results... 
30583 1726853712.67793: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 30583 1726853712.67799: in run() - task 02083763-bbaf-05ea-abc5-000000000f15 30583 1726853712.67803: variable 'ansible_search_path' from source: unknown 30583 1726853712.67805: variable 'ansible_search_path' from source: unknown 30583 1726853712.67977: calling self._execute() 30583 1726853712.67981: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853712.67984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853712.67987: variable 'omit' from source: magic vars 30583 1726853712.68314: variable 'ansible_distribution_major_version' from source: facts 30583 1726853712.68326: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853712.68456: variable 'profile_stat' from source: set_fact 30583 1726853712.68477: Evaluated conditional (profile_stat.stat.exists): False 30583 1726853712.68481: when evaluation is False, skipping this task 30583 1726853712.68483: _execute() done 30583 1726853712.68486: dumping result to json 30583 1726853712.68489: done dumping result, returning 30583 1726853712.68496: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [02083763-bbaf-05ea-abc5-000000000f15] 30583 1726853712.68501: sending task result for task 02083763-bbaf-05ea-abc5-000000000f15 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30583 1726853712.68640: no more pending results, returning what we have 30583 1726853712.68644: results queue empty 30583 1726853712.68645: checking for any_errors_fatal 30583 1726853712.68659: done checking for any_errors_fatal 30583 1726853712.68660: checking for max_fail_percentage 30583 1726853712.68662: done checking for max_fail_percentage 30583 1726853712.68663: checking to see if all 
hosts have failed and the running result is not ok 30583 1726853712.68663: done checking to see if all hosts have failed 30583 1726853712.68664: getting the remaining hosts for this loop 30583 1726853712.68666: done getting the remaining hosts for this loop 30583 1726853712.68670: getting the next task for host managed_node2 30583 1726853712.68785: done getting next task for host managed_node2 30583 1726853712.68789: ^ task is: TASK: Get NM profile info 30583 1726853712.68795: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853712.68800: getting variables 30583 1726853712.68802: in VariableManager get_vars() 30583 1726853712.68842: Calling all_inventory to load vars for managed_node2 30583 1726853712.68845: Calling groups_inventory to load vars for managed_node2 30583 1726853712.68849: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853712.68868: Calling all_plugins_play to load vars for managed_node2 30583 1726853712.68994: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853712.69000: Calling groups_plugins_play to load vars for managed_node2 30583 1726853712.69523: done sending task result for task 02083763-bbaf-05ea-abc5-000000000f15 30583 1726853712.69527: WORKER PROCESS EXITING 30583 1726853712.70732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853712.72377: done with get_vars() 30583 1726853712.72406: done getting variables 30583 1726853712.72534: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:35:12 -0400 (0:00:00.059) 0:00:48.062 ****** 30583 1726853712.72566: entering _queue_task() for managed_node2/shell 30583 1726853712.72932: worker is 1 (out of 1 available) 30583 1726853712.72944: exiting _queue_task() for managed_node2/shell 30583 1726853712.73176: done queuing things up, now waiting for results queue to drain 30583 1726853712.73178: waiting for pending results... 
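Before the shell task runs, the trace above records the standard skip path: the `when:` conditional (`profile_stat.stat.exists`) evaluated to False, so the executor short-circuited and returned a skip result without ever dispatching a module to the host. A minimal sketch of that control flow (an illustrative helper, not Ansible's actual TaskExecutor):

```python
def run_task(when_result, execute):
    """Sketch of the skip path logged above: when the conditional is False,
    the executor returns a skip result and never runs the module.
    (Illustrative only; not Ansible's real TaskExecutor.)"""
    if not when_result:
        return {
            "changed": False,
            "skipped": True,
            "skip_reason": "Conditional result was False",
        }
    return execute()

# Mirrors the "skipping: [managed_node2]" result printed in the log.
result = run_task(False, lambda: {"changed": True})
```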
30583 1726853712.73311: running TaskExecutor() for managed_node2/TASK: Get NM profile info 30583 1726853712.73410: in run() - task 02083763-bbaf-05ea-abc5-000000000f16 30583 1726853712.73433: variable 'ansible_search_path' from source: unknown 30583 1726853712.73441: variable 'ansible_search_path' from source: unknown 30583 1726853712.73484: calling self._execute() 30583 1726853712.73613: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853712.73622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853712.73626: variable 'omit' from source: magic vars 30583 1726853712.74097: variable 'ansible_distribution_major_version' from source: facts 30583 1726853712.74161: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853712.74168: variable 'omit' from source: magic vars 30583 1726853712.74212: variable 'omit' from source: magic vars 30583 1726853712.74346: variable 'profile' from source: play vars 30583 1726853712.74363: variable 'interface' from source: play vars 30583 1726853712.74440: variable 'interface' from source: play vars 30583 1726853712.74467: variable 'omit' from source: magic vars 30583 1726853712.74576: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853712.74580: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853712.74628: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853712.74660: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853712.74679: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853712.74866: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 
1726853712.74868: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853712.74872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853712.75047: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853712.75050: Set connection var ansible_timeout to 10 30583 1726853712.75053: Set connection var ansible_connection to ssh 30583 1726853712.75062: Set connection var ansible_shell_executable to /bin/sh 30583 1726853712.75065: Set connection var ansible_shell_type to sh 30583 1726853712.75076: Set connection var ansible_pipelining to False 30583 1726853712.75107: variable 'ansible_shell_executable' from source: unknown 30583 1726853712.75110: variable 'ansible_connection' from source: unknown 30583 1726853712.75113: variable 'ansible_module_compression' from source: unknown 30583 1726853712.75116: variable 'ansible_shell_type' from source: unknown 30583 1726853712.75119: variable 'ansible_shell_executable' from source: unknown 30583 1726853712.75121: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853712.75123: variable 'ansible_pipelining' from source: unknown 30583 1726853712.75126: variable 'ansible_timeout' from source: unknown 30583 1726853712.75128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853712.75268: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853712.75280: variable 'omit' from source: magic vars 30583 1726853712.75285: starting attempt loop 30583 1726853712.75288: running the handler 30583 1726853712.75298: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853712.75323: _low_level_execute_command(): starting 30583 1726853712.75330: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853712.76520: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853712.76630: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853712.76721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853712.78443: stdout chunk (state=3): >>>/root <<< 30583 1726853712.78624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853712.78627: stdout chunk (state=3): >>><<< 30583 1726853712.78629: stderr chunk (state=3): >>><<< 30583 1726853712.78745: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853712.78750: _low_level_execute_command(): starting 30583 1726853712.78754: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853712.7865367-32843-153726714982604 `" && echo ansible-tmp-1726853712.7865367-32843-153726714982604="` echo /root/.ansible/tmp/ansible-tmp-1726853712.7865367-32843-153726714982604 `" ) && sleep 0' 30583 1726853712.79375: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853712.79392: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853712.79408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853712.79538: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853712.79553: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853712.79606: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853712.79624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853712.79654: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853712.79992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853712.81786: stdout chunk (state=3): >>>ansible-tmp-1726853712.7865367-32843-153726714982604=/root/.ansible/tmp/ansible-tmp-1726853712.7865367-32843-153726714982604 <<< 30583 1726853712.82118: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853712.82121: stdout chunk (state=3): >>><<< 30583 1726853712.82124: stderr chunk (state=3): >>><<< 30583 1726853712.82284: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853712.7865367-32843-153726714982604=/root/.ansible/tmp/ansible-tmp-1726853712.7865367-32843-153726714982604 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853712.82287: variable 'ansible_module_compression' from source: unknown 30583 1726853712.82289: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30583 1726853712.82291: variable 'ansible_facts' from source: unknown 30583 1726853712.82661: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853712.7865367-32843-153726714982604/AnsiballZ_command.py 30583 1726853712.83299: Sending initial data 30583 1726853712.83302: Sent initial data (156 bytes) 30583 1726853712.85105: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853712.85210: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853712.85301: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853712.85329: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853712.85444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853712.87158: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853712.87397: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853712.7865367-32843-153726714982604/AnsiballZ_command.py" <<< 30583 1726853712.87401: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpvo3sp1xd /root/.ansible/tmp/ansible-tmp-1726853712.7865367-32843-153726714982604/AnsiballZ_command.py <<< 30583 1726853712.87474: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpvo3sp1xd" to remote "/root/.ansible/tmp/ansible-tmp-1726853712.7865367-32843-153726714982604/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853712.7865367-32843-153726714982604/AnsiballZ_command.py" <<< 30583 1726853712.88653: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853712.88686: stderr chunk (state=3): >>><<< 30583 1726853712.88700: stdout chunk (state=3): >>><<< 30583 1726853712.88733: done transferring module to remote 30583 1726853712.88748: _low_level_execute_command(): starting 30583 1726853712.88758: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853712.7865367-32843-153726714982604/ /root/.ansible/tmp/ansible-tmp-1726853712.7865367-32843-153726714982604/AnsiballZ_command.py && sleep 0' 30583 1726853712.89459: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853712.89477: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853712.89670: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853712.89764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853712.91731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853712.91735: stdout chunk (state=3): >>><<< 30583 1726853712.91741: stderr chunk (state=3): >>><<< 30583 1726853712.91762: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853712.91766: _low_level_execute_command(): starting 30583 1726853712.91854: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853712.7865367-32843-153726714982604/AnsiballZ_command.py && sleep 0' 30583 1726853712.92423: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853712.92436: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853712.92448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853712.92465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853712.92482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853712.92520: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853712.92593: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853712.92607: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853712.92643: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
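The commands above follow Ansible's usual remote-execution preamble: discover the home directory with `echo ~`, create a private `ansible-tmp-<epoch>-<pid>-<random>` directory under `umask 77`, transfer the AnsiballZ payload into it over sftp, then `chmod u+x` before invoking the remote Python. The first two steps can be reproduced locally (a sketch that runs the same `/bin/sh -c '... && sleep 0'` commands against a local temp directory instead of over the SSH mux channel):

```python
import os
import random
import subprocess
import tempfile
import time

def low_level_execute(cmd):
    """Local stand-in for _low_level_execute_command(): each step in the log
    is wrapped in /bin/sh -c '... && sleep 0'; here we run it locally."""
    proc = subprocess.run(["/bin/sh", "-c", cmd], capture_output=True, text=True)
    return proc.returncode, proc.stdout, proc.stderr

# Step 1: discover the home directory by letting the shell expand "~".
rc, stdout, _ = low_level_execute("echo ~ && sleep 0")
remote_home = stdout.strip()

# Step 2: create a private tmpdir named like the one in the log
# (ansible-tmp-<epoch>-<pid>-<random>); umask 77 makes it owner-only (0700).
base = tempfile.mkdtemp()
name = "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2**48))
rc2, stdout, _ = low_level_execute(
    '( umask 77 && mkdir -p "%s" && mkdir "%s/%s" && echo "%s/%s" ) && sleep 0'
    % (base, base, name, base, name)
)
tmpdir = stdout.strip()
```

The `&& sleep 0` suffix matches what the log shows Ansible appending to every low-level command; it forces the output to be flushed through the connection before the channel closes.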
30583 1726853712.92749: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853713.09944: stdout chunk (state=3): >>> {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 13:35:13.081099", "end": "2024-09-20 13:35:13.098267", "delta": "0:00:00.017168", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30583 1726853713.11779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853713.11784: stdout chunk (state=3): >>><<< 30583 1726853713.11787: stderr chunk (state=3): >>><<< 30583 1726853713.11792: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 13:35:13.081099", "end": "2024-09-20 13:35:13.098267", "delta": "0:00:00.017168", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853713.11795: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853712.7865367-32843-153726714982604/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853713.11798: _low_level_execute_command(): starting 30583 1726853713.11801: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1726853712.7865367-32843-153726714982604/ > /dev/null 2>&1 && sleep 0' 30583 1726853713.13389: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853713.13459: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853713.13468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853713.13569: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853713.13681: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853713.13690: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853713.13811: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853713.15720: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853713.15784: stderr chunk (state=3): >>><<< 30583 1726853713.15788: stdout chunk (state=3): >>><<< 30583 1726853713.15809: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.197 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
30583 1726853713.15823: handler run complete
30583 1726853713.15857: Evaluated conditional (False): False
30583 1726853713.16075: attempt loop complete, returning result
30583 1726853713.16078: _execute() done
30583 1726853713.16080: dumping result to json
30583 1726853713.16086: done dumping result, returning
30583 1726853713.16088: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [02083763-bbaf-05ea-abc5-000000000f16]
30583 1726853713.16089: sending task result for task 02083763-bbaf-05ea-abc5-000000000f16
30583 1726853713.16175: done sending task result for task 02083763-bbaf-05ea-abc5-000000000f16
30583 1726853713.16178: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false,
    "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc",
    "delta": "0:00:00.017168",
    "end": "2024-09-20 13:35:13.098267",
    "rc": 0,
    "start": "2024-09-20 13:35:13.081099"
}

STDOUT:

statebr /etc/NetworkManager/system-connections/statebr.nmconnection
30583 1726853713.16277: no more pending results, returning what we have
30583 1726853713.16281: results queue empty
30583 1726853713.16282: checking for any_errors_fatal
30583 1726853713.16290: done checking for any_errors_fatal
30583 1726853713.16291: checking for max_fail_percentage
30583 1726853713.16293: done checking for max_fail_percentage
30583 1726853713.16294: checking to see if all hosts have failed and the running result is not ok
30583 1726853713.16295: done checking to see if all hosts have failed
30583 1726853713.16295: getting the remaining hosts for this loop
30583 1726853713.16297: done getting the remaining hosts for this loop
30583 1726853713.16301: getting the next task for host managed_node2
30583 1726853713.16317: done getting next task for host managed_node2
30583 1726853713.16320: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
30583 1726853713.16325: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853713.16331: getting variables
30583 1726853713.16333: in VariableManager get_vars()
30583 1726853713.16576: Calling all_inventory to load vars for managed_node2
30583 1726853713.16580: Calling groups_inventory to load vars for managed_node2
30583 1726853713.16584: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853713.16599: Calling all_plugins_play to load vars for managed_node2
30583 1726853713.16603: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853713.16607: Calling groups_plugins_play to load vars for managed_node2
30583 1726853713.18606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853713.21158: done with get_vars()
30583 1726853713.21189: done getting variables
30583 1726853713.21253: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35
Friday 20 September 2024 13:35:13 -0400 (0:00:00.487) 0:00:48.550 ******
30583 1726853713.21298: entering _queue_task() for managed_node2/set_fact
30583 1726853713.21765: worker is 1 (out of 1 available)
30583 1726853713.21781: exiting _queue_task() for managed_node2/set_fact
30583 1726853713.21796: done queuing things up, now waiting for results queue to drain
30583 1726853713.21797: waiting for pending results...
30583 1726853713.22095: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
30583 1726853713.22234: in run() - task 02083763-bbaf-05ea-abc5-000000000f17
30583 1726853713.22297: variable 'ansible_search_path' from source: unknown
30583 1726853713.22478: variable 'ansible_search_path' from source: unknown
30583 1726853713.22484: calling self._execute()
30583 1726853713.22605: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853713.22626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853713.22647: variable 'omit' from source: magic vars
30583 1726853713.23427: variable 'ansible_distribution_major_version' from source: facts
30583 1726853713.23440: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853713.23625: variable 'nm_profile_exists' from source: set_fact
30583 1726853713.23639: Evaluated conditional (nm_profile_exists.rc == 0): True
30583 1726853713.23645: variable 'omit' from source: magic vars
30583 1726853713.23977: variable 'omit' from source: magic vars
30583 1726853713.24004: variable 'omit' from source: magic vars
30583 1726853713.24047: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30583 1726853713.24298: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30583 1726853713.24320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30583 1726853713.24348: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30583 1726853713.24352: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30583 1726853713.24450: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30583 1726853713.24453: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853713.24456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853713.24843: Set connection var ansible_module_compression to ZIP_DEFLATED
30583 1726853713.24846: Set connection var ansible_timeout to 10
30583 1726853713.24849: Set connection var ansible_connection to ssh
30583 1726853713.24851: Set connection var ansible_shell_executable to /bin/sh
30583 1726853713.24853: Set connection var ansible_shell_type to sh
30583 1726853713.24858: Set connection var ansible_pipelining to False
30583 1726853713.24892: variable 'ansible_shell_executable' from source: unknown
30583 1726853713.24916: variable 'ansible_connection' from source: unknown
30583 1726853713.24923: variable 'ansible_module_compression' from source: unknown
30583 1726853713.24929: variable 'ansible_shell_type' from source: unknown
30583 1726853713.24934: variable 'ansible_shell_executable' from source: unknown
30583 1726853713.24957: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853713.24967: variable 'ansible_pipelining' from source: unknown
30583 1726853713.25228: variable 'ansible_timeout' from source: unknown
30583 1726853713.25231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853713.25304: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30583 1726853713.25320: variable 'omit' from source: magic vars
30583 1726853713.25330: starting attempt loop
30583 1726853713.25336: running the handler
30583 1726853713.25352: handler run complete
30583 1726853713.25370: attempt loop complete, returning result
30583 1726853713.25385: _execute() done
30583 1726853713.25396: dumping result to json
30583 1726853713.25403: done dumping result, returning
30583 1726853713.25417: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-05ea-abc5-000000000f17]
30583 1726853713.25426: sending task result for task 02083763-bbaf-05ea-abc5-000000000f17
ok: [managed_node2] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": true,
        "lsr_net_profile_exists": true,
        "lsr_net_profile_fingerprint": true
    },
    "changed": false
}
30583 1726853713.25664: no more pending results, returning what we have
30583 1726853713.25668: results queue empty
30583 1726853713.25669: checking for any_errors_fatal
30583 1726853713.25681: done checking for any_errors_fatal
30583 1726853713.25682: checking for max_fail_percentage
30583 1726853713.25684: done checking for max_fail_percentage
30583 1726853713.25685: checking to see if all hosts have failed and the running result is not ok
30583 1726853713.25686: done checking to see if all hosts have failed
30583 1726853713.25686: getting the remaining hosts for this loop
30583 1726853713.25688: done getting the remaining hosts for this loop
30583 1726853713.25693: getting the next task for host managed_node2
30583 1726853713.25707: done getting next task for host managed_node2
30583 1726853713.25710: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }}
30583 1726853713.25717: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853713.25721: getting variables
30583 1726853713.25722: in VariableManager get_vars()
30583 1726853713.25764: Calling all_inventory to load vars for managed_node2
30583 1726853713.25767: Calling groups_inventory to load vars for managed_node2
30583 1726853713.25807: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853713.25821: Calling all_plugins_play to load vars for managed_node2
30583 1726853713.25825: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853713.25828: Calling groups_plugins_play to load vars for managed_node2
30583 1726853713.26530: done sending task result for task 02083763-bbaf-05ea-abc5-000000000f17
30583 1726853713.26534: WORKER PROCESS EXITING
30583 1726853713.27685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853713.31404: done with get_vars()
30583 1726853713.31439: done getting variables
30583 1726853713.31514: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30583 1726853713.31654: variable 'profile' from source: play vars
30583 1726853713.31661: variable 'interface' from source: play vars
30583 1726853713.31723: variable 'interface' from source: play vars

TASK [Get the ansible_managed comment in ifcfg-statebr] ************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49
Friday 20 September 2024 13:35:13 -0400 (0:00:00.104) 0:00:48.654 ******
30583 1726853713.31759: entering _queue_task() for managed_node2/command
30583 1726853713.32212: worker is 1 (out of 1 available)
30583 1726853713.32226: exiting _queue_task() for managed_node2/command
30583 1726853713.32368: done queuing things up, now waiting for results queue to drain
30583 1726853713.32370: waiting for pending results...
30583 1726853713.32585: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr
30583 1726853713.32737: in run() - task 02083763-bbaf-05ea-abc5-000000000f19
30583 1726853713.32761: variable 'ansible_search_path' from source: unknown
30583 1726853713.32770: variable 'ansible_search_path' from source: unknown
30583 1726853713.32823: calling self._execute()
30583 1726853713.32959: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853713.32981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853713.32998: variable 'omit' from source: magic vars
30583 1726853713.33441: variable 'ansible_distribution_major_version' from source: facts
30583 1726853713.33475: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853713.33629: variable 'profile_stat' from source: set_fact
30583 1726853713.33673: Evaluated conditional (profile_stat.stat.exists): False
30583 1726853713.33689: when evaluation is False, skipping this task
30583 1726853713.33697: _execute() done
30583 1726853713.33876: dumping result to json
30583 1726853713.33880: done dumping result, returning
30583 1726853713.33883: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr [02083763-bbaf-05ea-abc5-000000000f19]
30583 1726853713.33885: sending task result for task 02083763-bbaf-05ea-abc5-000000000f19
30583 1726853713.33954: done sending task result for task 02083763-bbaf-05ea-abc5-000000000f19
30583 1726853713.33960: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
30583 1726853713.34020: no more pending results, returning what we have
30583 1726853713.34024: results queue empty
30583 1726853713.34025: checking for any_errors_fatal
30583 1726853713.34035: done checking for any_errors_fatal
30583 1726853713.34036: checking for max_fail_percentage
30583 1726853713.34039: done checking for max_fail_percentage
30583 1726853713.34040: checking to see if all hosts have failed and the running result is not ok
30583 1726853713.34041: done checking to see if all hosts have failed
30583 1726853713.34041: getting the remaining hosts for this loop
30583 1726853713.34043: done getting the remaining hosts for this loop
30583 1726853713.34048: getting the next task for host managed_node2
30583 1726853713.34062: done getting next task for host managed_node2
30583 1726853713.34065: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }}
30583 1726853713.34073: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853713.34078: getting variables
30583 1726853713.34080: in VariableManager get_vars()
30583 1726853713.34120: Calling all_inventory to load vars for managed_node2
30583 1726853713.34124: Calling groups_inventory to load vars for managed_node2
30583 1726853713.34128: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853713.34142: Calling all_plugins_play to load vars for managed_node2
30583 1726853713.34146: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853713.34149: Calling groups_plugins_play to load vars for managed_node2
30583 1726853713.36125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853713.37853: done with get_vars()
30583 1726853713.37895: done getting variables
30583 1726853713.37963: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30583 1726853713.38097: variable 'profile' from source: play vars
30583 1726853713.38102: variable 'interface' from source: play vars
30583 1726853713.38161: variable 'interface' from source: play vars

TASK [Verify the ansible_managed comment in ifcfg-statebr] *********************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56
Friday 20 September 2024 13:35:13 -0400 (0:00:00.064) 0:00:48.719 ******
30583 1726853713.38203: entering _queue_task() for managed_node2/set_fact
30583 1726853713.38631: worker is 1 (out of 1 available)
30583 1726853713.38647: exiting _queue_task() for managed_node2/set_fact
30583 1726853713.38660: done queuing things up, now waiting for results queue to drain
30583 1726853713.38662: waiting for pending results...
30583 1726853713.38906: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr
30583 1726853713.39001: in run() - task 02083763-bbaf-05ea-abc5-000000000f1a
30583 1726853713.39014: variable 'ansible_search_path' from source: unknown
30583 1726853713.39017: variable 'ansible_search_path' from source: unknown
30583 1726853713.39047: calling self._execute()
30583 1726853713.39123: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853713.39126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853713.39136: variable 'omit' from source: magic vars
30583 1726853713.39411: variable 'ansible_distribution_major_version' from source: facts
30583 1726853713.39424: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853713.39509: variable 'profile_stat' from source: set_fact
30583 1726853713.39517: Evaluated conditional (profile_stat.stat.exists): False
30583 1726853713.39520: when evaluation is False, skipping this task
30583 1726853713.39524: _execute() done
30583 1726853713.39527: dumping result to json
30583 1726853713.39529: done dumping result, returning
30583 1726853713.39539: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [02083763-bbaf-05ea-abc5-000000000f1a]
30583 1726853713.39541: sending task result for task 02083763-bbaf-05ea-abc5-000000000f1a
30583 1726853713.39626: done sending task result for task 02083763-bbaf-05ea-abc5-000000000f1a
30583 1726853713.39629: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
30583 1726853713.39687: no more pending results, returning what we have
30583 1726853713.39692: results queue empty
30583 1726853713.39692: checking for any_errors_fatal
30583 1726853713.39701: done checking for any_errors_fatal
30583 1726853713.39702: checking for max_fail_percentage
30583 1726853713.39704: done checking for max_fail_percentage
30583 1726853713.39705: checking to see if all hosts have failed and the running result is not ok
30583 1726853713.39705: done checking to see if all hosts have failed
30583 1726853713.39706: getting the remaining hosts for this loop
30583 1726853713.39708: done getting the remaining hosts for this loop
30583 1726853713.39712: getting the next task for host managed_node2
30583 1726853713.39721: done getting next task for host managed_node2
30583 1726853713.39724: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }}
30583 1726853713.39729: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853713.39734: getting variables
30583 1726853713.39736: in VariableManager get_vars()
30583 1726853713.39772: Calling all_inventory to load vars for managed_node2
30583 1726853713.39775: Calling groups_inventory to load vars for managed_node2
30583 1726853713.39779: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853713.39790: Calling all_plugins_play to load vars for managed_node2
30583 1726853713.39792: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853713.39795: Calling groups_plugins_play to load vars for managed_node2
30583 1726853713.40798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853713.42589: done with get_vars()
30583 1726853713.42614: done getting variables
30583 1726853713.42669: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30583 1726853713.42754: variable 'profile' from source: play vars
30583 1726853713.42757: variable 'interface' from source: play vars
30583 1726853713.42799: variable 'interface' from source: play vars

TASK [Get the fingerprint comment in ifcfg-statebr] ****************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62
Friday 20 September 2024 13:35:13 -0400 (0:00:00.046) 0:00:48.765 ******
30583 1726853713.42824: entering _queue_task() for managed_node2/command
30583 1726853713.43084: worker is 1 (out of 1 available)
30583 1726853713.43098: exiting _queue_task() for managed_node2/command
30583 1726853713.43112: done queuing things up, now waiting for results queue to drain
30583 1726853713.43113: waiting for pending results...
30583 1726853713.43302: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr
30583 1726853713.43378: in run() - task 02083763-bbaf-05ea-abc5-000000000f1b
30583 1726853713.43391: variable 'ansible_search_path' from source: unknown
30583 1726853713.43395: variable 'ansible_search_path' from source: unknown
30583 1726853713.43422: calling self._execute()
30583 1726853713.43500: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853713.43504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853713.43514: variable 'omit' from source: magic vars
30583 1726853713.43784: variable 'ansible_distribution_major_version' from source: facts
30583 1726853713.43794: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853713.43878: variable 'profile_stat' from source: set_fact
30583 1726853713.43888: Evaluated conditional (profile_stat.stat.exists): False
30583 1726853713.43891: when evaluation is False, skipping this task
30583 1726853713.43894: _execute() done
30583 1726853713.43896: dumping result to json
30583 1726853713.43898: done dumping result, returning
30583 1726853713.43906: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr [02083763-bbaf-05ea-abc5-000000000f1b]
30583 1726853713.43911: sending task result for task 02083763-bbaf-05ea-abc5-000000000f1b
30583 1726853713.43996: done sending task result for task 02083763-bbaf-05ea-abc5-000000000f1b
30583 1726853713.43998: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
30583 1726853713.44050: no more pending results, returning what we have
30583 1726853713.44053: results queue empty
30583 1726853713.44054: checking for any_errors_fatal
30583 1726853713.44061: done checking for any_errors_fatal
30583 1726853713.44062: checking for max_fail_percentage
30583 1726853713.44064: done checking for max_fail_percentage
30583 1726853713.44065: checking to see if all hosts have failed and the running result is not ok
30583 1726853713.44066: done checking to see if all hosts have failed
30583 1726853713.44066: getting the remaining hosts for this loop
30583 1726853713.44068: done getting the remaining hosts for this loop
30583 1726853713.44073: getting the next task for host managed_node2
30583 1726853713.44082: done getting next task for host managed_node2
30583 1726853713.44084: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }}
30583 1726853713.44089: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853713.44093: getting variables
30583 1726853713.44095: in VariableManager get_vars()
30583 1726853713.44132: Calling all_inventory to load vars for managed_node2
30583 1726853713.44134: Calling groups_inventory to load vars for managed_node2
30583 1726853713.44138: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853713.44149: Calling all_plugins_play to load vars for managed_node2
30583 1726853713.44152: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853713.44154: Calling groups_plugins_play to load vars for managed_node2
30583 1726853713.45550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853713.46774: done with get_vars()
30583 1726853713.46793: done getting variables
30583 1726853713.46838: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30583 1726853713.46921: variable 'profile' from source: play vars
30583 1726853713.46924: variable 'interface' from source: play vars
30583 1726853713.46963: variable 'interface' from source: play vars

TASK [Verify the fingerprint comment in ifcfg-statebr] *************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69
Friday 20 September 2024 13:35:13 -0400 (0:00:00.041) 0:00:48.807 ******
30583 1726853713.46990: entering _queue_task() for managed_node2/set_fact
30583 1726853713.47248: worker is 1 (out of 1 available)
30583 1726853713.47262: exiting _queue_task() for managed_node2/set_fact
30583 1726853713.47279: done queuing things up, now waiting for results queue to drain
30583 1726853713.47280: waiting for pending results...
30583 1726853713.47465: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr
30583 1726853713.47550: in run() - task 02083763-bbaf-05ea-abc5-000000000f1c
30583 1726853713.47565: variable 'ansible_search_path' from source: unknown
30583 1726853713.47569: variable 'ansible_search_path' from source: unknown
30583 1726853713.47599: calling self._execute()
30583 1726853713.47672: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853713.47675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853713.47685: variable 'omit' from source: magic vars
30583 1726853713.48090: variable 'ansible_distribution_major_version' from source: facts
30583 1726853713.48093: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853713.48124: variable 'profile_stat' from source: set_fact
30583 1726853713.48134: Evaluated conditional (profile_stat.stat.exists): False
30583 1726853713.48136: when evaluation is False, skipping this task
30583 1726853713.48139: _execute() done
30583 1726853713.48142: dumping result to json
30583 1726853713.48144: done dumping result, returning
30583 1726853713.48153: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr [02083763-bbaf-05ea-abc5-000000000f1c]
30583 1726853713.48160: sending task result for task 02083763-bbaf-05ea-abc5-000000000f1c
30583 1726853713.48258: done sending task result for task 02083763-bbaf-05ea-abc5-000000000f1c
30583 1726853713.48261: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
30583 1726853713.48346: no more pending results, returning what we have
30583 1726853713.48350: results queue empty
30583 1726853713.48351: checking for any_errors_fatal
30583 1726853713.48363: done checking for any_errors_fatal
30583 1726853713.48363: checking for max_fail_percentage
30583 1726853713.48365: done checking for max_fail_percentage
30583 1726853713.48366: checking to see if all hosts have failed and the running result is not ok
30583 1726853713.48367: done checking to see if all hosts have failed
30583 1726853713.48367: getting the remaining hosts for this loop
30583 1726853713.48370: done getting the remaining hosts for this loop
30583 1726853713.48375: getting the next task for host managed_node2
30583 1726853713.48392: done getting next task for host managed_node2
30583 1726853713.48395: ^ task is: TASK: Assert that the profile is present - '{{ profile }}'
30583 1726853713.48403: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853713.48414: getting variables
30583 1726853713.48416: in VariableManager get_vars()
30583 1726853713.48454: Calling all_inventory to load vars for managed_node2
30583 1726853713.48457: Calling groups_inventory to load vars for managed_node2
30583 1726853713.48461: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853713.48629: Calling all_plugins_play to load vars for managed_node2
30583 1726853713.48635: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853713.48638: Calling groups_plugins_play to load vars for managed_node2
30583 1726853713.50343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853713.51294: done with get_vars()
30583 1726853713.51314: done getting variables
30583 1726853713.51363: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30583 1726853713.51449: variable 'profile' from source: play vars
30583 1726853713.51452: variable 'interface' from source: play vars
30583 1726853713.51498: variable 'interface' from source: play vars

TASK [Assert that the profile is present - 'statebr'] **************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5
Friday 20 September 2024 13:35:13 -0400 (0:00:00.045) 0:00:48.852 ******
30583 1726853713.51522: entering _queue_task() for managed_node2/assert
30583 1726853713.51790: worker is 1 (out of 1 available)
30583 1726853713.51805: exiting _queue_task() for managed_node2/assert
30583 1726853713.51818: done queuing things up, now waiting for results queue to drain
30583 1726853713.51819: waiting for pending results...
30583 1726853713.52063: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'statebr' 30583 1726853713.52204: in run() - task 02083763-bbaf-05ea-abc5-000000000e8c 30583 1726853713.52209: variable 'ansible_search_path' from source: unknown 30583 1726853713.52219: variable 'ansible_search_path' from source: unknown 30583 1726853713.52223: calling self._execute() 30583 1726853713.52372: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853713.52376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853713.52379: variable 'omit' from source: magic vars 30583 1726853713.52702: variable 'ansible_distribution_major_version' from source: facts 30583 1726853713.52723: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853713.52727: variable 'omit' from source: magic vars 30583 1726853713.52798: variable 'omit' from source: magic vars 30583 1726853713.52866: variable 'profile' from source: play vars 30583 1726853713.52869: variable 'interface' from source: play vars 30583 1726853713.52918: variable 'interface' from source: play vars 30583 1726853713.52932: variable 'omit' from source: magic vars 30583 1726853713.52968: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853713.53002: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853713.53019: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853713.53041: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853713.53044: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853713.53108: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 30583 1726853713.53111: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853713.53113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853713.53219: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853713.53222: Set connection var ansible_timeout to 10 30583 1726853713.53225: Set connection var ansible_connection to ssh 30583 1726853713.53227: Set connection var ansible_shell_executable to /bin/sh 30583 1726853713.53286: Set connection var ansible_shell_type to sh 30583 1726853713.53289: Set connection var ansible_pipelining to False 30583 1726853713.53291: variable 'ansible_shell_executable' from source: unknown 30583 1726853713.53293: variable 'ansible_connection' from source: unknown 30583 1726853713.53295: variable 'ansible_module_compression' from source: unknown 30583 1726853713.53297: variable 'ansible_shell_type' from source: unknown 30583 1726853713.53304: variable 'ansible_shell_executable' from source: unknown 30583 1726853713.53307: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853713.53309: variable 'ansible_pipelining' from source: unknown 30583 1726853713.53311: variable 'ansible_timeout' from source: unknown 30583 1726853713.53313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853713.53468: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853713.53504: variable 'omit' from source: magic vars 30583 1726853713.53507: starting attempt loop 30583 1726853713.53510: running the handler 30583 1726853713.53587: variable 'lsr_net_profile_exists' from source: set_fact 30583 1726853713.53590: Evaluated conditional 
(lsr_net_profile_exists): True 30583 1726853713.53593: handler run complete 30583 1726853713.53595: attempt loop complete, returning result 30583 1726853713.53598: _execute() done 30583 1726853713.53599: dumping result to json 30583 1726853713.53602: done dumping result, returning 30583 1726853713.53604: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'statebr' [02083763-bbaf-05ea-abc5-000000000e8c] 30583 1726853713.53622: sending task result for task 02083763-bbaf-05ea-abc5-000000000e8c 30583 1726853713.53724: done sending task result for task 02083763-bbaf-05ea-abc5-000000000e8c 30583 1726853713.53727: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30583 1726853713.53815: no more pending results, returning what we have 30583 1726853713.53818: results queue empty 30583 1726853713.53819: checking for any_errors_fatal 30583 1726853713.53830: done checking for any_errors_fatal 30583 1726853713.53831: checking for max_fail_percentage 30583 1726853713.53833: done checking for max_fail_percentage 30583 1726853713.53834: checking to see if all hosts have failed and the running result is not ok 30583 1726853713.53835: done checking to see if all hosts have failed 30583 1726853713.53836: getting the remaining hosts for this loop 30583 1726853713.53840: done getting the remaining hosts for this loop 30583 1726853713.53844: getting the next task for host managed_node2 30583 1726853713.53853: done getting next task for host managed_node2 30583 1726853713.53857: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 30583 1726853713.53864: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853713.53870: getting variables 30583 1726853713.53906: in VariableManager get_vars() 30583 1726853713.54019: Calling all_inventory to load vars for managed_node2 30583 1726853713.54022: Calling groups_inventory to load vars for managed_node2 30583 1726853713.54025: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853713.54033: Calling all_plugins_play to load vars for managed_node2 30583 1726853713.54035: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853713.54037: Calling groups_plugins_play to load vars for managed_node2 30583 1726853713.59114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853713.60047: done with get_vars() 30583 1726853713.60074: done getting variables 30583 1726853713.60137: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853713.60246: variable 'profile' from source: play vars 30583 1726853713.60251: variable 'interface' from source: play vars 30583 1726853713.60302: variable 'interface' from 
source: play vars TASK [Assert that the ansible managed comment is present in 'statebr'] ********* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 13:35:13 -0400 (0:00:00.088) 0:00:48.940 ****** 30583 1726853713.60341: entering _queue_task() for managed_node2/assert 30583 1726853713.60711: worker is 1 (out of 1 available) 30583 1726853713.60726: exiting _queue_task() for managed_node2/assert 30583 1726853713.60739: done queuing things up, now waiting for results queue to drain 30583 1726853713.60740: waiting for pending results... 30583 1726853713.61108: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'statebr' 30583 1726853713.61183: in run() - task 02083763-bbaf-05ea-abc5-000000000e8d 30583 1726853713.61210: variable 'ansible_search_path' from source: unknown 30583 1726853713.61213: variable 'ansible_search_path' from source: unknown 30583 1726853713.61283: calling self._execute() 30583 1726853713.61399: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853713.61406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853713.61413: variable 'omit' from source: magic vars 30583 1726853713.61779: variable 'ansible_distribution_major_version' from source: facts 30583 1726853713.61788: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853713.61794: variable 'omit' from source: magic vars 30583 1726853713.61842: variable 'omit' from source: magic vars 30583 1726853713.61946: variable 'profile' from source: play vars 30583 1726853713.61949: variable 'interface' from source: play vars 30583 1726853713.62045: variable 'interface' from source: play vars 30583 1726853713.62055: variable 'omit' from source: magic vars 30583 1726853713.62117: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 
30583 1726853713.62131: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853713.62145: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853713.62161: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853713.62177: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853713.62198: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853713.62201: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853713.62204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853713.62333: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853713.62339: Set connection var ansible_timeout to 10 30583 1726853713.62342: Set connection var ansible_connection to ssh 30583 1726853713.62345: Set connection var ansible_shell_executable to /bin/sh 30583 1726853713.62347: Set connection var ansible_shell_type to sh 30583 1726853713.62350: Set connection var ansible_pipelining to False 30583 1726853713.62378: variable 'ansible_shell_executable' from source: unknown 30583 1726853713.62382: variable 'ansible_connection' from source: unknown 30583 1726853713.62384: variable 'ansible_module_compression' from source: unknown 30583 1726853713.62387: variable 'ansible_shell_type' from source: unknown 30583 1726853713.62389: variable 'ansible_shell_executable' from source: unknown 30583 1726853713.62392: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853713.62394: variable 'ansible_pipelining' from source: unknown 30583 1726853713.62396: variable 'ansible_timeout' from source: unknown 30583 1726853713.62398: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853713.62503: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853713.62523: variable 'omit' from source: magic vars 30583 1726853713.62529: starting attempt loop 30583 1726853713.62532: running the handler 30583 1726853713.62628: variable 'lsr_net_profile_ansible_managed' from source: set_fact 30583 1726853713.62633: Evaluated conditional (lsr_net_profile_ansible_managed): True 30583 1726853713.62635: handler run complete 30583 1726853713.62648: attempt loop complete, returning result 30583 1726853713.62650: _execute() done 30583 1726853713.62653: dumping result to json 30583 1726853713.62655: done dumping result, returning 30583 1726853713.62664: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'statebr' [02083763-bbaf-05ea-abc5-000000000e8d] 30583 1726853713.62669: sending task result for task 02083763-bbaf-05ea-abc5-000000000e8d 30583 1726853713.62756: done sending task result for task 02083763-bbaf-05ea-abc5-000000000e8d 30583 1726853713.62759: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30583 1726853713.62844: no more pending results, returning what we have 30583 1726853713.62847: results queue empty 30583 1726853713.62848: checking for any_errors_fatal 30583 1726853713.62858: done checking for any_errors_fatal 30583 1726853713.62859: checking for max_fail_percentage 30583 1726853713.62861: done checking for max_fail_percentage 30583 1726853713.62862: checking to see if all hosts have failed and the running result is not ok 30583 1726853713.62863: done checking to see if all hosts have failed 30583 1726853713.62863: 
getting the remaining hosts for this loop 30583 1726853713.62865: done getting the remaining hosts for this loop 30583 1726853713.62869: getting the next task for host managed_node2 30583 1726853713.62879: done getting next task for host managed_node2 30583 1726853713.62882: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 30583 1726853713.62887: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853713.62890: getting variables 30583 1726853713.62892: in VariableManager get_vars() 30583 1726853713.62923: Calling all_inventory to load vars for managed_node2 30583 1726853713.62925: Calling groups_inventory to load vars for managed_node2 30583 1726853713.62929: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853713.62938: Calling all_plugins_play to load vars for managed_node2 30583 1726853713.62941: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853713.62943: Calling groups_plugins_play to load vars for managed_node2 30583 1726853713.63753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853713.64972: done with get_vars() 30583 1726853713.64988: done getting variables 30583 1726853713.65029: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853713.65125: variable 'profile' from source: play vars 30583 1726853713.65128: variable 'interface' from source: play vars 30583 1726853713.65174: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in statebr] *************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 13:35:13 -0400 (0:00:00.048) 0:00:48.989 ****** 30583 1726853713.65199: entering _queue_task() for managed_node2/assert 30583 1726853713.65445: worker is 1 (out of 1 available) 30583 1726853713.65461: exiting _queue_task() for managed_node2/assert 30583 1726853713.65477: done queuing things up, now waiting for results queue to drain 30583 1726853713.65478: waiting for pending results... 
30583 1726853713.65900: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in statebr 30583 1726853713.66176: in run() - task 02083763-bbaf-05ea-abc5-000000000e8e 30583 1726853713.66180: variable 'ansible_search_path' from source: unknown 30583 1726853713.66183: variable 'ansible_search_path' from source: unknown 30583 1726853713.66186: calling self._execute() 30583 1726853713.66188: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853713.66196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853713.66207: variable 'omit' from source: magic vars 30583 1726853713.66747: variable 'ansible_distribution_major_version' from source: facts 30583 1726853713.66786: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853713.66790: variable 'omit' from source: magic vars 30583 1726853713.66846: variable 'omit' from source: magic vars 30583 1726853713.66963: variable 'profile' from source: play vars 30583 1726853713.66967: variable 'interface' from source: play vars 30583 1726853713.67032: variable 'interface' from source: play vars 30583 1726853713.67074: variable 'omit' from source: magic vars 30583 1726853713.67098: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853713.67139: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853713.67150: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853713.67189: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853713.67193: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853713.67213: variable 'inventory_hostname' from source: host 
vars for 'managed_node2' 30583 1726853713.67216: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853713.67218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853713.67342: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853713.67346: Set connection var ansible_timeout to 10 30583 1726853713.67348: Set connection var ansible_connection to ssh 30583 1726853713.67356: Set connection var ansible_shell_executable to /bin/sh 30583 1726853713.67358: Set connection var ansible_shell_type to sh 30583 1726853713.67367: Set connection var ansible_pipelining to False 30583 1726853713.67391: variable 'ansible_shell_executable' from source: unknown 30583 1726853713.67415: variable 'ansible_connection' from source: unknown 30583 1726853713.67418: variable 'ansible_module_compression' from source: unknown 30583 1726853713.67421: variable 'ansible_shell_type' from source: unknown 30583 1726853713.67423: variable 'ansible_shell_executable' from source: unknown 30583 1726853713.67425: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853713.67427: variable 'ansible_pipelining' from source: unknown 30583 1726853713.67436: variable 'ansible_timeout' from source: unknown 30583 1726853713.67438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853713.67582: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853713.67612: variable 'omit' from source: magic vars 30583 1726853713.67617: starting attempt loop 30583 1726853713.67620: running the handler 30583 1726853713.67713: variable 'lsr_net_profile_fingerprint' from source: set_fact 30583 1726853713.67716: Evaluated 
conditional (lsr_net_profile_fingerprint): True 30583 1726853713.67722: handler run complete 30583 1726853713.67768: attempt loop complete, returning result 30583 1726853713.67790: _execute() done 30583 1726853713.67799: dumping result to json 30583 1726853713.67802: done dumping result, returning 30583 1726853713.67804: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in statebr [02083763-bbaf-05ea-abc5-000000000e8e] 30583 1726853713.67806: sending task result for task 02083763-bbaf-05ea-abc5-000000000e8e 30583 1726853713.67877: done sending task result for task 02083763-bbaf-05ea-abc5-000000000e8e 30583 1726853713.67883: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30583 1726853713.68000: no more pending results, returning what we have 30583 1726853713.68003: results queue empty 30583 1726853713.68004: checking for any_errors_fatal 30583 1726853713.68012: done checking for any_errors_fatal 30583 1726853713.68013: checking for max_fail_percentage 30583 1726853713.68015: done checking for max_fail_percentage 30583 1726853713.68015: checking to see if all hosts have failed and the running result is not ok 30583 1726853713.68016: done checking to see if all hosts have failed 30583 1726853713.68017: getting the remaining hosts for this loop 30583 1726853713.68018: done getting the remaining hosts for this loop 30583 1726853713.68024: getting the next task for host managed_node2 30583 1726853713.68035: done getting next task for host managed_node2 30583 1726853713.68041: ^ task is: TASK: Conditional asserts 30583 1726853713.68044: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853713.68047: getting variables 30583 1726853713.68049: in VariableManager get_vars() 30583 1726853713.68081: Calling all_inventory to load vars for managed_node2 30583 1726853713.68084: Calling groups_inventory to load vars for managed_node2 30583 1726853713.68087: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853713.68097: Calling all_plugins_play to load vars for managed_node2 30583 1726853713.68099: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853713.68102: Calling groups_plugins_play to load vars for managed_node2 30583 1726853713.69056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853713.70347: done with get_vars() 30583 1726853713.70375: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 13:35:13 -0400 (0:00:00.052) 0:00:49.041 ****** 30583 1726853713.70468: entering _queue_task() for managed_node2/include_tasks 30583 1726853713.70854: worker is 1 (out of 1 available) 30583 1726853713.70869: exiting _queue_task() for managed_node2/include_tasks 30583 1726853713.70890: done queuing things up, now waiting for results queue to drain 30583 1726853713.70892: waiting for pending results... 
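
At this point all three asserts from `assert_profile_present.yml` (task paths `:5`, `:10`, `:15` in the banners above) have passed for managed_node2. From the task names, file line positions, and evaluated conditionals recorded in the trace, the file plausibly has the following shape; only the task names and the asserted variables (all registered earlier via `set_fact`) are confirmed by the log, the rest is an assumption:

```yaml
# Hedged reconstruction of tasks/assert_profile_present.yml.
# Task names and the asserted set_fact variables are taken from the
# trace above; formatting and any fail_msg text are assumptions.
- name: "Assert that the profile is present - '{{ profile }}'"
  assert:
    that:
      - lsr_net_profile_exists

- name: "Assert that the ansible managed comment is present in '{{ profile }}'"
  assert:
    that:
      - lsr_net_profile_ansible_managed

- name: "Assert that the fingerprint comment is present in {{ profile }}"
  assert:
    that:
      - lsr_net_profile_fingerprint
```

Each `that` conditional evaluates to True in the trace, so the assert action plugin returns `changed: false` with "All assertions passed" without ever contacting the remote host.
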
30583 1726853713.71208: running TaskExecutor() for managed_node2/TASK: Conditional asserts 30583 1726853713.71286: in run() - task 02083763-bbaf-05ea-abc5-000000000a4f 30583 1726853713.71488: variable 'ansible_search_path' from source: unknown 30583 1726853713.71494: variable 'ansible_search_path' from source: unknown 30583 1726853713.71579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853713.73575: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853713.73623: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853713.73651: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853713.73680: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853713.73700: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853713.73769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853713.73792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853713.73809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853713.73836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30583 1726853713.73848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853713.73965: dumping result to json 30583 1726853713.73969: done dumping result, returning 30583 1726853713.73975: done running TaskExecutor() for managed_node2/TASK: Conditional asserts [02083763-bbaf-05ea-abc5-000000000a4f] 30583 1726853713.73980: sending task result for task 02083763-bbaf-05ea-abc5-000000000a4f 30583 1726853713.74080: done sending task result for task 02083763-bbaf-05ea-abc5-000000000a4f 30583 1726853713.74083: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } 30583 1726853713.74133: no more pending results, returning what we have 30583 1726853713.74136: results queue empty 30583 1726853713.74137: checking for any_errors_fatal 30583 1726853713.74143: done checking for any_errors_fatal 30583 1726853713.74143: checking for max_fail_percentage 30583 1726853713.74145: done checking for max_fail_percentage 30583 1726853713.74146: checking to see if all hosts have failed and the running result is not ok 30583 1726853713.74147: done checking to see if all hosts have failed 30583 1726853713.74148: getting the remaining hosts for this loop 30583 1726853713.74149: done getting the remaining hosts for this loop 30583 1726853713.74153: getting the next task for host managed_node2 30583 1726853713.74163: done getting next task for host managed_node2 30583 1726853713.74166: ^ task is: TASK: Success in test '{{ lsr_description }}' 30583 1726853713.74168: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853713.74174: getting variables 30583 1726853713.74175: in VariableManager get_vars() 30583 1726853713.74209: Calling all_inventory to load vars for managed_node2 30583 1726853713.74212: Calling groups_inventory to load vars for managed_node2 30583 1726853713.74215: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853713.74225: Calling all_plugins_play to load vars for managed_node2 30583 1726853713.74228: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853713.74230: Calling groups_plugins_play to load vars for managed_node2 30583 1726853713.75402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853713.77177: done with get_vars() 30583 1726853713.77205: done getting variables 30583 1726853713.77270: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853713.77394: variable 'lsr_description' from source: include params TASK [Success in test 'I can activate an existing profile'] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 13:35:13 -0400 (0:00:00.069) 0:00:49.111 ****** 30583 1726853713.77425: entering _queue_task() for managed_node2/debug 30583 1726853713.78204: worker is 1 (out of 
1 available) 30583 1726853713.78219: exiting _queue_task() for managed_node2/debug 30583 1726853713.78233: done queuing things up, now waiting for results queue to drain 30583 1726853713.78234: waiting for pending results... 30583 1726853713.78686: running TaskExecutor() for managed_node2/TASK: Success in test 'I can activate an existing profile' 30583 1726853713.78868: in run() - task 02083763-bbaf-05ea-abc5-000000000a50 30583 1726853713.79077: variable 'ansible_search_path' from source: unknown 30583 1726853713.79081: variable 'ansible_search_path' from source: unknown 30583 1726853713.79084: calling self._execute() 30583 1726853713.79220: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853713.79227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853713.79312: variable 'omit' from source: magic vars 30583 1726853713.79691: variable 'ansible_distribution_major_version' from source: facts 30583 1726853713.79700: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853713.79707: variable 'omit' from source: magic vars 30583 1726853713.79736: variable 'omit' from source: magic vars 30583 1726853713.79805: variable 'lsr_description' from source: include params 30583 1726853713.79819: variable 'omit' from source: magic vars 30583 1726853713.79853: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853713.79886: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853713.79901: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853713.79915: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853713.79924: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30583 1726853713.79947: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853713.79950: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853713.79953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853713.80024: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853713.80028: Set connection var ansible_timeout to 10 30583 1726853713.80031: Set connection var ansible_connection to ssh 30583 1726853713.80036: Set connection var ansible_shell_executable to /bin/sh 30583 1726853713.80039: Set connection var ansible_shell_type to sh 30583 1726853713.80046: Set connection var ansible_pipelining to False 30583 1726853713.80065: variable 'ansible_shell_executable' from source: unknown 30583 1726853713.80069: variable 'ansible_connection' from source: unknown 30583 1726853713.80073: variable 'ansible_module_compression' from source: unknown 30583 1726853713.80076: variable 'ansible_shell_type' from source: unknown 30583 1726853713.80078: variable 'ansible_shell_executable' from source: unknown 30583 1726853713.80082: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853713.80084: variable 'ansible_pipelining' from source: unknown 30583 1726853713.80086: variable 'ansible_timeout' from source: unknown 30583 1726853713.80088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853713.80188: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853713.80200: variable 'omit' from source: magic vars 30583 1726853713.80203: starting attempt loop 30583 1726853713.80207: running the handler 30583 
1726853713.80245: handler run complete 30583 1726853713.80257: attempt loop complete, returning result 30583 1726853713.80261: _execute() done 30583 1726853713.80263: dumping result to json 30583 1726853713.80265: done dumping result, returning 30583 1726853713.80268: done running TaskExecutor() for managed_node2/TASK: Success in test 'I can activate an existing profile' [02083763-bbaf-05ea-abc5-000000000a50] 30583 1726853713.80275: sending task result for task 02083763-bbaf-05ea-abc5-000000000a50 ok: [managed_node2] => {} MSG: +++++ Success in test 'I can activate an existing profile' +++++ 30583 1726853713.80400: no more pending results, returning what we have 30583 1726853713.80403: results queue empty 30583 1726853713.80404: checking for any_errors_fatal 30583 1726853713.80411: done checking for any_errors_fatal 30583 1726853713.80412: checking for max_fail_percentage 30583 1726853713.80414: done checking for max_fail_percentage 30583 1726853713.80415: checking to see if all hosts have failed and the running result is not ok 30583 1726853713.80415: done checking to see if all hosts have failed 30583 1726853713.80416: getting the remaining hosts for this loop 30583 1726853713.80418: done getting the remaining hosts for this loop 30583 1726853713.80421: getting the next task for host managed_node2 30583 1726853713.80431: done getting next task for host managed_node2 30583 1726853713.80434: ^ task is: TASK: Cleanup 30583 1726853713.80438: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 30583 1726853713.80443: getting variables 30583 1726853713.80445: in VariableManager get_vars() 30583 1726853713.80481: Calling all_inventory to load vars for managed_node2 30583 1726853713.80484: Calling groups_inventory to load vars for managed_node2 30583 1726853713.80487: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853713.80497: Calling all_plugins_play to load vars for managed_node2 30583 1726853713.80500: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853713.80503: Calling groups_plugins_play to load vars for managed_node2 30583 1726853713.81084: done sending task result for task 02083763-bbaf-05ea-abc5-000000000a50 30583 1726853713.81088: WORKER PROCESS EXITING 30583 1726853713.81398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853713.82690: done with get_vars() 30583 1726853713.82712: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 13:35:13 -0400 (0:00:00.053) 0:00:49.165 ****** 30583 1726853713.82785: entering _queue_task() for managed_node2/include_tasks 30583 1726853713.83041: worker is 1 (out of 1 available) 30583 1726853713.83057: exiting _queue_task() for managed_node2/include_tasks 30583 1726853713.83072: done queuing things up, now waiting for results queue to drain 30583 1726853713.83075: waiting for pending results... 
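Every debug entry above is prefixed by the controller process's PID and a Unix timestamp with microseconds (`30583 1726853713.83261: ...`). When reading a saved log, it can help to turn those epochs into wall-clock times. A minimal sketch, assuming that two-field prefix format (the function name is ours, not an Ansible tool; the `date -d` flag is GNU, with a fallback for other platforms):

```shell
# Split ansible debug-log entries of the form:
#   <pid> <epoch.micros>: <message>
# into pid, HH:MM:SS (UTC), and the message text.
parse_ansible_debug() {
  while read -r pid ts rest; do
    ts=${ts%:}        # strip the trailing colon from the timestamp field
    secs=${ts%.*}     # whole seconds for date(1)
    # GNU date supports -d @epoch; fall back to the raw epoch elsewhere
    printf '%s %s %s\n' "$pid" \
      "$(date -u -d "@$secs" +%H:%M:%S 2>/dev/null || echo "$secs")" \
      "$rest"
  done
}
```

For example, piping the line `30583 1726853713.83261: running TaskExecutor()` through `parse_ansible_debug` yields the PID, the UTC time of day, and the message, which matches the `13:35:13 -0400` task banner in this run.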
30583 1726853713.83261: running TaskExecutor() for managed_node2/TASK: Cleanup 30583 1726853713.83334: in run() - task 02083763-bbaf-05ea-abc5-000000000a54 30583 1726853713.83345: variable 'ansible_search_path' from source: unknown 30583 1726853713.83350: variable 'ansible_search_path' from source: unknown 30583 1726853713.83389: variable 'lsr_cleanup' from source: include params 30583 1726853713.83551: variable 'lsr_cleanup' from source: include params 30583 1726853713.83605: variable 'omit' from source: magic vars 30583 1726853713.83710: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853713.83716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853713.83726: variable 'omit' from source: magic vars 30583 1726853713.83905: variable 'ansible_distribution_major_version' from source: facts 30583 1726853713.83912: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853713.83919: variable 'item' from source: unknown 30583 1726853713.83964: variable 'item' from source: unknown 30583 1726853713.83991: variable 'item' from source: unknown 30583 1726853713.84032: variable 'item' from source: unknown 30583 1726853713.84146: dumping result to json 30583 1726853713.84149: done dumping result, returning 30583 1726853713.84151: done running TaskExecutor() for managed_node2/TASK: Cleanup [02083763-bbaf-05ea-abc5-000000000a54] 30583 1726853713.84153: sending task result for task 02083763-bbaf-05ea-abc5-000000000a54 30583 1726853713.84191: done sending task result for task 02083763-bbaf-05ea-abc5-000000000a54 30583 1726853713.84195: WORKER PROCESS EXITING 30583 1726853713.84216: no more pending results, returning what we have 30583 1726853713.84221: in VariableManager get_vars() 30583 1726853713.84267: Calling all_inventory to load vars for managed_node2 30583 1726853713.84272: Calling groups_inventory to load vars for managed_node2 30583 1726853713.84276: Calling 
all_plugins_inventory to load vars for managed_node2 30583 1726853713.84292: Calling all_plugins_play to load vars for managed_node2 30583 1726853713.84296: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853713.84298: Calling groups_plugins_play to load vars for managed_node2 30583 1726853713.85816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853713.87706: done with get_vars() 30583 1726853713.87732: variable 'ansible_search_path' from source: unknown 30583 1726853713.87734: variable 'ansible_search_path' from source: unknown 30583 1726853713.87864: we have included files to process 30583 1726853713.87866: generating all_blocks data 30583 1726853713.87869: done generating all_blocks data 30583 1726853713.87875: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30583 1726853713.87876: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30583 1726853713.88075: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30583 1726853713.88401: done processing included file 30583 1726853713.88403: iterating over new_blocks loaded from include file 30583 1726853713.88404: in VariableManager get_vars() 30583 1726853713.88421: done with get_vars() 30583 1726853713.88424: filtering new block on tags 30583 1726853713.88451: done filtering new block on tags 30583 1726853713.88454: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node2 => (item=tasks/cleanup_profile+device.yml) 30583 1726853713.88462: extending task lists for all hosts with included blocks 
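The `include_tasks` expansion above reports each pulled-in file on an `included: <path> for <host> => (item=...)` line. To see at a glance which task files each host ended up including from a saved log, a rough one-function sketch (the log filename argument is a placeholder; the line shape is assumed from the output above):

```shell
# List unique "included: <file> for <host>" records from a saved
# ansible-playbook -vvvv log passed as $1.
list_includes() {
  grep -o 'included: [^ ]* for [^ ]*' "$1" | sort -u
}
```

Running this against the present run would report `cleanup_profile+device.yml` being included for `managed_node2`.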
30583 1726853713.90435: done extending task lists 30583 1726853713.90437: done processing included files 30583 1726853713.90438: results queue empty 30583 1726853713.90439: checking for any_errors_fatal 30583 1726853713.90442: done checking for any_errors_fatal 30583 1726853713.90443: checking for max_fail_percentage 30583 1726853713.90444: done checking for max_fail_percentage 30583 1726853713.90445: checking to see if all hosts have failed and the running result is not ok 30583 1726853713.90446: done checking to see if all hosts have failed 30583 1726853713.90446: getting the remaining hosts for this loop 30583 1726853713.90448: done getting the remaining hosts for this loop 30583 1726853713.90458: getting the next task for host managed_node2 30583 1726853713.90464: done getting next task for host managed_node2 30583 1726853713.90466: ^ task is: TASK: Cleanup profile and device 30583 1726853713.90470: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853713.90474: getting variables 30583 1726853713.90475: in VariableManager get_vars() 30583 1726853713.90489: Calling all_inventory to load vars for managed_node2 30583 1726853713.90492: Calling groups_inventory to load vars for managed_node2 30583 1726853713.90495: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853713.90501: Calling all_plugins_play to load vars for managed_node2 30583 1726853713.90503: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853713.90506: Calling groups_plugins_play to load vars for managed_node2 30583 1726853713.91904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853713.94336: done with get_vars() 30583 1726853713.94363: done getting variables 30583 1726853713.94449: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 13:35:13 -0400 (0:00:00.116) 0:00:49.283 ****** 30583 1726853713.94601: entering _queue_task() for managed_node2/shell 30583 1726853713.95106: worker is 1 (out of 1 available) 30583 1726853713.95118: exiting _queue_task() for managed_node2/shell 30583 1726853713.95254: done queuing things up, now waiting for results queue to drain 30583 1726853713.95258: waiting for pending results... 
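Each `TASK [...]` banner above carries a timing line such as `Friday 20 September 2024 13:35:13 -0400 (0:00:00.116) 0:00:49.283`, where the parenthesised field is the previous task's duration and the last field is cumulative elapsed time. A crude way to rank tasks by duration from a saved log (filename is a placeholder; this is a lexical sort, which is fine for same-width `H:MM:SS.micros` values):

```shell
# Extract the per-task duration field from timing banners like
#   ... -0400 (0:00:00.116) 0:00:49.283 ******
# in the log file passed as $1, slowest first.
task_durations() {
  grep -o '([0-9]:[0-9][0-9]:[0-9][0-9]\.[0-9]*)' "$1" | tr -d '()' | sort -r
}
```

For real profiling, the `profile_tasks` callback plugin (in the `ansible.posix` collection) produces this summary natively at the end of a run.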
30583 1726853713.95504: running TaskExecutor() for managed_node2/TASK: Cleanup profile and device 30583 1726853713.95689: in run() - task 02083763-bbaf-05ea-abc5-000000000f6d 30583 1726853713.95694: variable 'ansible_search_path' from source: unknown 30583 1726853713.95701: variable 'ansible_search_path' from source: unknown 30583 1726853713.95729: calling self._execute() 30583 1726853713.95908: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853713.95913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853713.95916: variable 'omit' from source: magic vars 30583 1726853713.96283: variable 'ansible_distribution_major_version' from source: facts 30583 1726853713.96306: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853713.96319: variable 'omit' from source: magic vars 30583 1726853713.96388: variable 'omit' from source: magic vars 30583 1726853713.96558: variable 'interface' from source: play vars 30583 1726853713.96591: variable 'omit' from source: magic vars 30583 1726853713.96649: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853713.96707: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853713.96737: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853713.96763: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853713.96889: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853713.96893: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853713.96895: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853713.96898: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853713.96948: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853713.96961: Set connection var ansible_timeout to 10 30583 1726853713.96969: Set connection var ansible_connection to ssh 30583 1726853713.96983: Set connection var ansible_shell_executable to /bin/sh 30583 1726853713.97000: Set connection var ansible_shell_type to sh 30583 1726853713.97023: Set connection var ansible_pipelining to False 30583 1726853713.97076: variable 'ansible_shell_executable' from source: unknown 30583 1726853713.97130: variable 'ansible_connection' from source: unknown 30583 1726853713.97147: variable 'ansible_module_compression' from source: unknown 30583 1726853713.97161: variable 'ansible_shell_type' from source: unknown 30583 1726853713.97174: variable 'ansible_shell_executable' from source: unknown 30583 1726853713.97186: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853713.97197: variable 'ansible_pipelining' from source: unknown 30583 1726853713.97226: variable 'ansible_timeout' from source: unknown 30583 1726853713.97248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853713.97646: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853713.97676: variable 'omit' from source: magic vars 30583 1726853713.97692: starting attempt loop 30583 1726853713.97707: running the handler 30583 1726853713.97786: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853713.97853: _low_level_execute_command(): starting 30583 1726853713.97976: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853713.99129: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853713.99160: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853713.99191: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853713.99382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853713.99565: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853714.01503: stdout chunk (state=3): >>>/root <<< 30583 1726853714.01508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853714.01510: stderr chunk (state=3): >>><<< 30583 1726853714.01763: stdout chunk (state=3): 
>>><<< 30583 1726853714.01768: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853714.01774: _low_level_execute_command(): starting 30583 1726853714.01778: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853714.0171454-32938-133310858433994 `" && echo ansible-tmp-1726853714.0171454-32938-133310858433994="` echo /root/.ansible/tmp/ansible-tmp-1726853714.0171454-32938-133310858433994 `" ) && sleep 0' 30583 1726853714.02948: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853714.03084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853714.03577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853714.03583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853714.05640: stdout chunk (state=3): >>>ansible-tmp-1726853714.0171454-32938-133310858433994=/root/.ansible/tmp/ansible-tmp-1726853714.0171454-32938-133310858433994 <<< 30583 1726853714.05775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853714.05792: stderr chunk (state=3): >>><<< 30583 1726853714.05892: stdout chunk (state=3): >>><<< 30583 1726853714.05915: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853714.0171454-32938-133310858433994=/root/.ansible/tmp/ansible-tmp-1726853714.0171454-32938-133310858433994 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853714.06091: variable 'ansible_module_compression' from source: unknown 30583 1726853714.06102: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30583 1726853714.06105: variable 'ansible_facts' from source: unknown 30583 1726853714.06333: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853714.0171454-32938-133310858433994/AnsiballZ_command.py 30583 1726853714.06819: Sending initial data 30583 1726853714.06822: Sent initial data (156 bytes) 30583 1726853714.07783: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853714.08083: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853714.08110: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853714.08192: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853714.09900: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853714.09952: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853714.10023: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpwk86h6t8 /root/.ansible/tmp/ansible-tmp-1726853714.0171454-32938-133310858433994/AnsiballZ_command.py <<< 30583 1726853714.10027: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853714.0171454-32938-133310858433994/AnsiballZ_command.py" <<< 30583 1726853714.10101: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpwk86h6t8" to remote "/root/.ansible/tmp/ansible-tmp-1726853714.0171454-32938-133310858433994/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853714.0171454-32938-133310858433994/AnsiballZ_command.py" <<< 30583 1726853714.11673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853714.11907: stderr chunk (state=3): >>><<< 30583 1726853714.11910: stdout chunk (state=3): >>><<< 30583 1726853714.11941: done transferring module to remote 30583 1726853714.11953: _low_level_execute_command(): starting 30583 1726853714.11959: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853714.0171454-32938-133310858433994/ /root/.ansible/tmp/ansible-tmp-1726853714.0171454-32938-133310858433994/AnsiballZ_command.py && sleep 0' 30583 1726853714.13059: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853714.13326: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853714.13329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853714.13332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853714.13334: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 
30583 1726853714.13337: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853714.13351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853714.13377: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853714.13452: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853714.13731: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853714.13787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853714.15744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853714.15748: stdout chunk (state=3): >>><<< 30583 1726853714.15755: stderr chunk (state=3): >>><<< 30583 1726853714.15780: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853714.15783: _low_level_execute_command(): starting 30583 1726853714.15786: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853714.0171454-32938-133310858433994/AnsiballZ_command.py && sleep 0' 30583 1726853714.17095: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853714.17099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853714.17101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853714.17286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853714.17424: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853714.17746: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853714.38528: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (a240f7a0-666a-4048-8567-0de2206b9c72) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'", "rc": 0, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 13:35:14.334212", "end": "2024-09-20 13:35:14.382994", "delta": "0:00:00.048782", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30583 1726853714.41663: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853714.41667: stdout chunk (state=3): >>><<< 30583 1726853714.41669: stderr chunk (state=3): >>><<< 30583 1726853714.41829: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "Connection 'statebr' (a240f7a0-666a-4048-8567-0de2206b9c72) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'", "rc": 0, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 13:35:14.334212", "end": "2024-09-20 13:35:14.382994", "delta": "0:00:00.048782", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853714.41834: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853714.0171454-32938-133310858433994/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853714.41838: _low_level_execute_command(): starting 30583 1726853714.41840: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853714.0171454-32938-133310858433994/ > /dev/null 2>&1 && sleep 0' 30583 1726853714.43218: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853714.43251: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853714.43324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853714.45478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853714.45482: stderr chunk (state=3): >>><<< 30583 1726853714.45678: stdout chunk (state=3): >>><<< 30583 1726853714.45682: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853714.45684: handler run complete 30583 1726853714.45687: Evaluated conditional (False): False 30583 1726853714.45689: attempt loop complete, returning result 30583 1726853714.45691: _execute() done 30583 1726853714.45693: dumping result to json 30583 1726853714.45695: done dumping result, returning 30583 1726853714.45697: done running TaskExecutor() for managed_node2/TASK: Cleanup profile and device [02083763-bbaf-05ea-abc5-000000000f6d] 30583 1726853714.45699: sending task result for task 02083763-bbaf-05ea-abc5-000000000f6d 30583 1726853714.45875: done sending task result for task 02083763-bbaf-05ea-abc5-000000000f6d ok: [managed_node2] => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.048782", "end": "2024-09-20 13:35:14.382994", "rc": 0, "start": "2024-09-20 13:35:14.334212" } STDOUT: Connection 'statebr' (a240f7a0-666a-4048-8567-0de2206b9c72) successfully deleted. 
STDERR: Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' 30583 1726853714.45949: no more pending results, returning what we have 30583 1726853714.45957: results queue empty 30583 1726853714.45958: checking for any_errors_fatal 30583 1726853714.45960: done checking for any_errors_fatal 30583 1726853714.45960: checking for max_fail_percentage 30583 1726853714.45963: done checking for max_fail_percentage 30583 1726853714.45964: checking to see if all hosts have failed and the running result is not ok 30583 1726853714.45964: done checking to see if all hosts have failed 30583 1726853714.45965: getting the remaining hosts for this loop 30583 1726853714.45967: done getting the remaining hosts for this loop 30583 1726853714.45973: getting the next task for host managed_node2 30583 1726853714.45984: done getting next task for host managed_node2 30583 1726853714.45988: ^ task is: TASK: Include the task 'run_test.yml' 30583 1726853714.45990: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853714.45994: getting variables 30583 1726853714.45995: in VariableManager get_vars() 30583 1726853714.46030: Calling all_inventory to load vars for managed_node2 30583 1726853714.46032: Calling groups_inventory to load vars for managed_node2 30583 1726853714.46035: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853714.46046: Calling all_plugins_play to load vars for managed_node2 30583 1726853714.46048: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853714.46051: Calling groups_plugins_play to load vars for managed_node2 30583 1726853714.46785: WORKER PROCESS EXITING 30583 1726853714.49478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853714.54902: done with get_vars() 30583 1726853714.54933: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:83 Friday 20 September 2024 13:35:14 -0400 (0:00:00.604) 0:00:49.887 ****** 30583 1726853714.55173: entering _queue_task() for managed_node2/include_tasks 30583 1726853714.55949: worker is 1 (out of 1 available) 30583 1726853714.55963: exiting _queue_task() for managed_node2/include_tasks 30583 1726853714.56031: done queuing things up, now waiting for results queue to drain 30583 1726853714.56033: waiting for pending results... 
30583 1726853714.56407: running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' 30583 1726853714.56691: in run() - task 02083763-bbaf-05ea-abc5-000000000013 30583 1726853714.56700: variable 'ansible_search_path' from source: unknown 30583 1726853714.56740: calling self._execute() 30583 1726853714.56841: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853714.56845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853714.56860: variable 'omit' from source: magic vars 30583 1726853714.57806: variable 'ansible_distribution_major_version' from source: facts 30583 1726853714.57809: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853714.57931: _execute() done 30583 1726853714.57935: dumping result to json 30583 1726853714.57943: done dumping result, returning 30583 1726853714.57946: done running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' [02083763-bbaf-05ea-abc5-000000000013] 30583 1726853714.57949: sending task result for task 02083763-bbaf-05ea-abc5-000000000013 30583 1726853714.58403: no more pending results, returning what we have 30583 1726853714.58421: in VariableManager get_vars() 30583 1726853714.58473: Calling all_inventory to load vars for managed_node2 30583 1726853714.58476: Calling groups_inventory to load vars for managed_node2 30583 1726853714.58480: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853714.58494: Calling all_plugins_play to load vars for managed_node2 30583 1726853714.58497: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853714.58499: Calling groups_plugins_play to load vars for managed_node2 30583 1726853714.59131: done sending task result for task 02083763-bbaf-05ea-abc5-000000000013 30583 1726853714.59135: WORKER PROCESS EXITING 30583 1726853714.62400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 30583 1726853714.66425: done with get_vars() 30583 1726853714.66452: variable 'ansible_search_path' from source: unknown 30583 1726853714.66477: we have included files to process 30583 1726853714.66479: generating all_blocks data 30583 1726853714.66480: done generating all_blocks data 30583 1726853714.66485: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30583 1726853714.66486: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30583 1726853714.66489: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30583 1726853714.66988: in VariableManager get_vars() 30583 1726853714.67097: done with get_vars() 30583 1726853714.67156: in VariableManager get_vars() 30583 1726853714.67179: done with get_vars() 30583 1726853714.67220: in VariableManager get_vars() 30583 1726853714.67244: done with get_vars() 30583 1726853714.67288: in VariableManager get_vars() 30583 1726853714.67362: done with get_vars() 30583 1726853714.67423: in VariableManager get_vars() 30583 1726853714.67602: done with get_vars() 30583 1726853714.68478: in VariableManager get_vars() 30583 1726853714.68495: done with get_vars() 30583 1726853714.68507: done processing included file 30583 1726853714.68509: iterating over new_blocks loaded from include file 30583 1726853714.68510: in VariableManager get_vars() 30583 1726853714.68521: done with get_vars() 30583 1726853714.68523: filtering new block on tags 30583 1726853714.68636: done filtering new block on tags 30583 1726853714.68639: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node2 30583 1726853714.68780: extending task lists for all hosts with included 
blocks 30583 1726853714.68816: done extending task lists 30583 1726853714.68817: done processing included files 30583 1726853714.68818: results queue empty 30583 1726853714.68818: checking for any_errors_fatal 30583 1726853714.68823: done checking for any_errors_fatal 30583 1726853714.68824: checking for max_fail_percentage 30583 1726853714.68825: done checking for max_fail_percentage 30583 1726853714.68826: checking to see if all hosts have failed and the running result is not ok 30583 1726853714.68826: done checking to see if all hosts have failed 30583 1726853714.68827: getting the remaining hosts for this loop 30583 1726853714.68828: done getting the remaining hosts for this loop 30583 1726853714.68831: getting the next task for host managed_node2 30583 1726853714.68835: done getting next task for host managed_node2 30583 1726853714.68837: ^ task is: TASK: TEST: {{ lsr_description }} 30583 1726853714.68839: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853714.68841: getting variables 30583 1726853714.68842: in VariableManager get_vars() 30583 1726853714.68851: Calling all_inventory to load vars for managed_node2 30583 1726853714.68853: Calling groups_inventory to load vars for managed_node2 30583 1726853714.68858: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853714.68864: Calling all_plugins_play to load vars for managed_node2 30583 1726853714.68867: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853714.68869: Calling groups_plugins_play to load vars for managed_node2 30583 1726853714.72020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853714.73641: done with get_vars() 30583 1726853714.73669: done getting variables 30583 1726853714.73814: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853714.74022: variable 'lsr_description' from source: include params TASK [TEST: I can remove an existing profile without taking it down] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 13:35:14 -0400 (0:00:00.190) 0:00:50.077 ****** 30583 1726853714.74062: entering _queue_task() for managed_node2/debug 30583 1726853714.74911: worker is 1 (out of 1 available) 30583 1726853714.74925: exiting _queue_task() for managed_node2/debug 30583 1726853714.74938: done queuing things up, now waiting for results queue to drain 30583 1726853714.74939: waiting for pending results... 
30583 1726853714.75818: running TaskExecutor() for managed_node2/TASK: TEST: I can remove an existing profile without taking it down 30583 1726853714.75822: in run() - task 02083763-bbaf-05ea-abc5-000000001005 30583 1726853714.75859: variable 'ansible_search_path' from source: unknown 30583 1726853714.75868: variable 'ansible_search_path' from source: unknown 30583 1726853714.75918: calling self._execute() 30583 1726853714.76184: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853714.76198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853714.76214: variable 'omit' from source: magic vars 30583 1726853714.76977: variable 'ansible_distribution_major_version' from source: facts 30583 1726853714.77276: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853714.77280: variable 'omit' from source: magic vars 30583 1726853714.77283: variable 'omit' from source: magic vars 30583 1726853714.77286: variable 'lsr_description' from source: include params 30583 1726853714.77288: variable 'omit' from source: magic vars 30583 1726853714.77517: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853714.77559: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853714.77776: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853714.77779: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853714.77782: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853714.77785: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853714.77787: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 
1726853714.77790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853714.78176: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853714.78179: Set connection var ansible_timeout to 10 30583 1726853714.78182: Set connection var ansible_connection to ssh 30583 1726853714.78185: Set connection var ansible_shell_executable to /bin/sh 30583 1726853714.78187: Set connection var ansible_shell_type to sh 30583 1726853714.78189: Set connection var ansible_pipelining to False 30583 1726853714.78191: variable 'ansible_shell_executable' from source: unknown 30583 1726853714.78193: variable 'ansible_connection' from source: unknown 30583 1726853714.78196: variable 'ansible_module_compression' from source: unknown 30583 1726853714.78197: variable 'ansible_shell_type' from source: unknown 30583 1726853714.78199: variable 'ansible_shell_executable' from source: unknown 30583 1726853714.78201: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853714.78203: variable 'ansible_pipelining' from source: unknown 30583 1726853714.78205: variable 'ansible_timeout' from source: unknown 30583 1726853714.78207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853714.78477: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853714.78495: variable 'omit' from source: magic vars 30583 1726853714.78505: starting attempt loop 30583 1726853714.78512: running the handler 30583 1726853714.78560: handler run complete 30583 1726853714.78976: attempt loop complete, returning result 30583 1726853714.78980: _execute() done 30583 1726853714.78982: dumping result to json 30583 1726853714.78985: done dumping result, returning 
30583 1726853714.78987: done running TaskExecutor() for managed_node2/TASK: TEST: I can remove an existing profile without taking it down [02083763-bbaf-05ea-abc5-000000001005] 30583 1726853714.78989: sending task result for task 02083763-bbaf-05ea-abc5-000000001005 30583 1726853714.79063: done sending task result for task 02083763-bbaf-05ea-abc5-000000001005 30583 1726853714.79067: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: ########## I can remove an existing profile without taking it down ########## 30583 1726853714.79129: no more pending results, returning what we have 30583 1726853714.79133: results queue empty 30583 1726853714.79135: checking for any_errors_fatal 30583 1726853714.79136: done checking for any_errors_fatal 30583 1726853714.79137: checking for max_fail_percentage 30583 1726853714.79139: done checking for max_fail_percentage 30583 1726853714.79140: checking to see if all hosts have failed and the running result is not ok 30583 1726853714.79141: done checking to see if all hosts have failed 30583 1726853714.79141: getting the remaining hosts for this loop 30583 1726853714.79144: done getting the remaining hosts for this loop 30583 1726853714.79148: getting the next task for host managed_node2 30583 1726853714.79160: done getting next task for host managed_node2 30583 1726853714.79163: ^ task is: TASK: Show item 30583 1726853714.79166: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853714.79173: getting variables 30583 1726853714.79175: in VariableManager get_vars() 30583 1726853714.79213: Calling all_inventory to load vars for managed_node2 30583 1726853714.79216: Calling groups_inventory to load vars for managed_node2 30583 1726853714.79220: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853714.79232: Calling all_plugins_play to load vars for managed_node2 30583 1726853714.79235: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853714.79238: Calling groups_plugins_play to load vars for managed_node2 30583 1726853714.82566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853714.87386: done with get_vars() 30583 1726853714.87425: done getting variables 30583 1726853714.87598: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 13:35:14 -0400 (0:00:00.135) 0:00:50.213 ****** 30583 1726853714.87628: entering _queue_task() for managed_node2/debug 30583 1726853714.88695: worker is 1 (out of 1 available) 30583 1726853714.88710: exiting _queue_task() for managed_node2/debug 30583 1726853714.88724: done queuing things up, now waiting for results queue to drain 30583 1726853714.88726: waiting for pending results... 
30583 1726853714.89161: running TaskExecutor() for managed_node2/TASK: Show item 30583 1726853714.89487: in run() - task 02083763-bbaf-05ea-abc5-000000001006 30583 1726853714.89510: variable 'ansible_search_path' from source: unknown 30583 1726853714.89517: variable 'ansible_search_path' from source: unknown 30583 1726853714.89643: variable 'omit' from source: magic vars 30583 1726853714.90019: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853714.90036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853714.90079: variable 'omit' from source: magic vars 30583 1726853714.90878: variable 'ansible_distribution_major_version' from source: facts 30583 1726853714.90900: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853714.90902: variable 'omit' from source: magic vars 30583 1726853714.90905: variable 'omit' from source: magic vars 30583 1726853714.91145: variable 'item' from source: unknown 30583 1726853714.91311: variable 'item' from source: unknown 30583 1726853714.91439: variable 'omit' from source: magic vars 30583 1726853714.91499: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853714.91583: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853714.91670: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853714.91714: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853714.91735: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853714.91810: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853714.91918: variable 'ansible_host' from source: host vars for 'managed_node2' 
30583 1726853714.91928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853714.92159: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853714.92376: Set connection var ansible_timeout to 10 30583 1726853714.92381: Set connection var ansible_connection to ssh 30583 1726853714.92383: Set connection var ansible_shell_executable to /bin/sh 30583 1726853714.92385: Set connection var ansible_shell_type to sh 30583 1726853714.92387: Set connection var ansible_pipelining to False 30583 1726853714.92390: variable 'ansible_shell_executable' from source: unknown 30583 1726853714.92392: variable 'ansible_connection' from source: unknown 30583 1726853714.92394: variable 'ansible_module_compression' from source: unknown 30583 1726853714.92396: variable 'ansible_shell_type' from source: unknown 30583 1726853714.92404: variable 'ansible_shell_executable' from source: unknown 30583 1726853714.92406: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853714.92408: variable 'ansible_pipelining' from source: unknown 30583 1726853714.92410: variable 'ansible_timeout' from source: unknown 30583 1726853714.92512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853714.92693: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853714.92948: variable 'omit' from source: magic vars 30583 1726853714.92951: starting attempt loop 30583 1726853714.92954: running the handler 30583 1726853714.92963: variable 'lsr_description' from source: include params 30583 1726853714.93193: variable 'lsr_description' from source: include params 30583 1726853714.93210: handler run complete 30583 1726853714.93249: attempt loop 
complete, returning result 30583 1726853714.93283: variable 'item' from source: unknown 30583 1726853714.93466: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can remove an existing profile without taking it down" } 30583 1726853714.93776: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853714.93780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853714.93783: variable 'omit' from source: magic vars 30583 1726853714.93938: variable 'ansible_distribution_major_version' from source: facts 30583 1726853714.93948: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853714.93956: variable 'omit' from source: magic vars 30583 1726853714.93983: variable 'omit' from source: magic vars 30583 1726853714.94025: variable 'item' from source: unknown 30583 1726853714.94144: variable 'item' from source: unknown 30583 1726853714.94148: variable 'omit' from source: magic vars 30583 1726853714.94150: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853714.94152: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853714.94155: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853714.94164: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853714.94172: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853714.94253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853714.94256: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853714.94264: Set connection var 
ansible_timeout to 10 30583 1726853714.94272: Set connection var ansible_connection to ssh 30583 1726853714.94281: Set connection var ansible_shell_executable to /bin/sh 30583 1726853714.94287: Set connection var ansible_shell_type to sh 30583 1726853714.94299: Set connection var ansible_pipelining to False 30583 1726853714.94321: variable 'ansible_shell_executable' from source: unknown 30583 1726853714.94328: variable 'ansible_connection' from source: unknown 30583 1726853714.94335: variable 'ansible_module_compression' from source: unknown 30583 1726853714.94341: variable 'ansible_shell_type' from source: unknown 30583 1726853714.94347: variable 'ansible_shell_executable' from source: unknown 30583 1726853714.94353: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853714.94364: variable 'ansible_pipelining' from source: unknown 30583 1726853714.94372: variable 'ansible_timeout' from source: unknown 30583 1726853714.94379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853714.94460: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853714.94479: variable 'omit' from source: magic vars 30583 1726853714.94488: starting attempt loop 30583 1726853714.94493: running the handler 30583 1726853714.94517: variable 'lsr_setup' from source: include params 30583 1726853714.94675: variable 'lsr_setup' from source: include params 30583 1726853714.94678: handler run complete 30583 1726853714.94682: attempt loop complete, returning result 30583 1726853714.94684: variable 'item' from source: unknown 30583 1726853714.94738: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", 
"lsr_setup": [ "tasks/create_bridge_profile.yml", "tasks/activate_profile.yml" ] } 30583 1726853714.94943: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853714.94948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853714.94950: variable 'omit' from source: magic vars 30583 1726853714.95076: variable 'ansible_distribution_major_version' from source: facts 30583 1726853714.95087: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853714.95095: variable 'omit' from source: magic vars 30583 1726853714.95112: variable 'omit' from source: magic vars 30583 1726853714.95161: variable 'item' from source: unknown 30583 1726853714.95252: variable 'item' from source: unknown 30583 1726853714.95329: variable 'omit' from source: magic vars 30583 1726853714.95336: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853714.95338: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853714.95418: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853714.95423: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853714.95488: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853714.95648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853714.95687: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853714.95700: Set connection var ansible_timeout to 10 30583 1726853714.95707: Set connection var ansible_connection to ssh 30583 1726853714.95717: Set connection var ansible_shell_executable to /bin/sh 30583 1726853714.95724: Set connection var ansible_shell_type to sh 30583 
1726853714.95738: Set connection var ansible_pipelining to False 30583 1726853714.96078: variable 'ansible_shell_executable' from source: unknown 30583 1726853714.96081: variable 'ansible_connection' from source: unknown 30583 1726853714.96084: variable 'ansible_module_compression' from source: unknown 30583 1726853714.96087: variable 'ansible_shell_type' from source: unknown 30583 1726853714.96090: variable 'ansible_shell_executable' from source: unknown 30583 1726853714.96092: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853714.96095: variable 'ansible_pipelining' from source: unknown 30583 1726853714.96098: variable 'ansible_timeout' from source: unknown 30583 1726853714.96100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853714.96143: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853714.96155: variable 'omit' from source: magic vars 30583 1726853714.96163: starting attempt loop 30583 1726853714.96169: running the handler 30583 1726853714.96197: variable 'lsr_test' from source: include params 30583 1726853714.96266: variable 'lsr_test' from source: include params 30583 1726853714.96399: handler run complete 30583 1726853714.96419: attempt loop complete, returning result 30583 1726853714.96515: variable 'item' from source: unknown 30583 1726853714.96545: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/remove_profile.yml" ] } 30583 1726853714.96877: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853714.96881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 
1726853714.96884: variable 'omit' from source: magic vars 30583 1726853714.97228: variable 'ansible_distribution_major_version' from source: facts 30583 1726853714.97239: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853714.97246: variable 'omit' from source: magic vars 30583 1726853714.97262: variable 'omit' from source: magic vars 30583 1726853714.97307: variable 'item' from source: unknown 30583 1726853714.97389: variable 'item' from source: unknown 30583 1726853714.97392: variable 'omit' from source: magic vars 30583 1726853714.97444: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853714.97466: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853714.97498: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853714.97513: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853714.97521: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853714.97529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853714.97606: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853714.97675: Set connection var ansible_timeout to 10 30583 1726853714.97678: Set connection var ansible_connection to ssh 30583 1726853714.97680: Set connection var ansible_shell_executable to /bin/sh 30583 1726853714.97682: Set connection var ansible_shell_type to sh 30583 1726853714.97684: Set connection var ansible_pipelining to False 30583 1726853714.97686: variable 'ansible_shell_executable' from source: unknown 30583 1726853714.97688: variable 'ansible_connection' from source: unknown 30583 1726853714.97690: variable 'ansible_module_compression' from 
source: unknown 30583 1726853714.97692: variable 'ansible_shell_type' from source: unknown 30583 1726853714.97694: variable 'ansible_shell_executable' from source: unknown 30583 1726853714.97696: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853714.97698: variable 'ansible_pipelining' from source: unknown 30583 1726853714.97700: variable 'ansible_timeout' from source: unknown 30583 1726853714.97718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853714.97819: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853714.97833: variable 'omit' from source: magic vars 30583 1726853714.97843: starting attempt loop 30583 1726853714.97849: running the handler 30583 1726853714.97876: variable 'lsr_assert' from source: include params 30583 1726853714.98039: variable 'lsr_assert' from source: include params 30583 1726853714.98042: handler run complete 30583 1726853714.98045: attempt loop complete, returning result 30583 1726853714.98047: variable 'item' from source: unknown 30583 1726853714.98074: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_device_present.yml", "tasks/assert_profile_absent.yml" ] } 30583 1726853714.98256: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853714.98260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853714.98263: variable 'omit' from source: magic vars 30583 1726853714.98578: variable 'ansible_distribution_major_version' from source: facts 30583 1726853714.98585: Evaluated conditional (ansible_distribution_major_version != '6'): True 
30583 1726853714.98588: variable 'omit' from source: magic vars 30583 1726853714.98590: variable 'omit' from source: magic vars 30583 1726853714.98593: variable 'item' from source: unknown 30583 1726853714.98629: variable 'item' from source: unknown 30583 1726853714.98650: variable 'omit' from source: magic vars 30583 1726853714.98677: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853714.98695: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853714.98705: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853714.98723: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853714.98740: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853714.98743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853714.98829: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853714.98840: Set connection var ansible_timeout to 10 30583 1726853714.98847: Set connection var ansible_connection to ssh 30583 1726853714.98860: Set connection var ansible_shell_executable to /bin/sh 30583 1726853714.98868: Set connection var ansible_shell_type to sh 30583 1726853714.98889: Set connection var ansible_pipelining to False 30583 1726853714.99015: variable 'ansible_shell_executable' from source: unknown 30583 1726853714.99018: variable 'ansible_connection' from source: unknown 30583 1726853714.99020: variable 'ansible_module_compression' from source: unknown 30583 1726853714.99023: variable 'ansible_shell_type' from source: unknown 30583 1726853714.99025: variable 'ansible_shell_executable' from source: unknown 30583 1726853714.99027: variable 'ansible_host' from source: host vars 
for 'managed_node2' 30583 1726853714.99029: variable 'ansible_pipelining' from source: unknown 30583 1726853714.99031: variable 'ansible_timeout' from source: unknown 30583 1726853714.99033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853714.99057: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853714.99076: variable 'omit' from source: magic vars 30583 1726853714.99088: starting attempt loop 30583 1726853714.99095: running the handler 30583 1726853714.99213: handler run complete 30583 1726853714.99236: attempt loop complete, returning result 30583 1726853714.99255: variable 'item' from source: unknown 30583 1726853714.99321: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 30583 1726853714.99897: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853714.99900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853714.99902: variable 'omit' from source: magic vars 30583 1726853715.00003: variable 'ansible_distribution_major_version' from source: facts 30583 1726853715.00014: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853715.00024: variable 'omit' from source: magic vars 30583 1726853715.00044: variable 'omit' from source: magic vars 30583 1726853715.00250: variable 'item' from source: unknown 30583 1726853715.00377: variable 'item' from source: unknown 30583 1726853715.00466: variable 'omit' from source: magic vars 30583 1726853715.00470: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853715.00474: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853715.00477: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853715.00479: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853715.00481: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853715.00483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853715.00630: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853715.00645: Set connection var ansible_timeout to 10 30583 1726853715.00693: Set connection var ansible_connection to ssh 30583 1726853715.00707: Set connection var ansible_shell_executable to /bin/sh 30583 1726853715.00718: Set connection var ansible_shell_type to sh 30583 1726853715.00735: Set connection var ansible_pipelining to False 30583 1726853715.00821: variable 'ansible_shell_executable' from source: unknown 30583 1726853715.00829: variable 'ansible_connection' from source: unknown 30583 1726853715.00839: variable 'ansible_module_compression' from source: unknown 30583 1726853715.00852: variable 'ansible_shell_type' from source: unknown 30583 1726853715.00861: variable 'ansible_shell_executable' from source: unknown 30583 1726853715.00876: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853715.01012: variable 'ansible_pipelining' from source: unknown 30583 1726853715.01016: variable 'ansible_timeout' from source: unknown 30583 1726853715.01018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853715.01338: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853715.01345: variable 'omit' from source: magic vars 30583 1726853715.01347: starting attempt loop 30583 1726853715.01351: running the handler 30583 1726853715.01353: variable 'lsr_fail_debug' from source: play vars 30583 1726853715.01355: variable 'lsr_fail_debug' from source: play vars 30583 1726853715.01401: handler run complete 30583 1726853715.01465: attempt loop complete, returning result 30583 1726853715.01490: variable 'item' from source: unknown 30583 1726853715.01613: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 30583 1726853715.02096: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853715.02100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853715.02103: variable 'omit' from source: magic vars 30583 1726853715.02161: variable 'ansible_distribution_major_version' from source: facts 30583 1726853715.02311: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853715.02315: variable 'omit' from source: magic vars 30583 1726853715.02317: variable 'omit' from source: magic vars 30583 1726853715.02353: variable 'item' from source: unknown 30583 1726853715.02578: variable 'item' from source: unknown 30583 1726853715.02581: variable 'omit' from source: magic vars 30583 1726853715.02583: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853715.02585: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853715.02591: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853715.02604: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853715.02611: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853715.02618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853715.02745: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853715.02803: Set connection var ansible_timeout to 10 30583 1726853715.03079: Set connection var ansible_connection to ssh 30583 1726853715.03082: Set connection var ansible_shell_executable to /bin/sh 30583 1726853715.03084: Set connection var ansible_shell_type to sh 30583 1726853715.03086: Set connection var ansible_pipelining to False 30583 1726853715.03087: variable 'ansible_shell_executable' from source: unknown 30583 1726853715.03089: variable 'ansible_connection' from source: unknown 30583 1726853715.03091: variable 'ansible_module_compression' from source: unknown 30583 1726853715.03092: variable 'ansible_shell_type' from source: unknown 30583 1726853715.03094: variable 'ansible_shell_executable' from source: unknown 30583 1726853715.03095: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853715.03098: variable 'ansible_pipelining' from source: unknown 30583 1726853715.03101: variable 'ansible_timeout' from source: unknown 30583 1726853715.03104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853715.03254: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853715.03267: variable 'omit' from source: magic vars 30583 1726853715.03278: starting attempt loop 30583 1726853715.03323: running the handler 30583 1726853715.03346: variable 'lsr_cleanup' from source: include params 30583 1726853715.03467: variable 'lsr_cleanup' from source: include params 30583 1726853715.03554: handler run complete 30583 1726853715.03573: attempt loop complete, returning result 30583 1726853715.03592: variable 'item' from source: unknown 30583 1726853715.03772: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 30583 1726853715.03937: dumping result to json 30583 1726853715.03939: done dumping result, returning 30583 1726853715.03941: done running TaskExecutor() for managed_node2/TASK: Show item [02083763-bbaf-05ea-abc5-000000001006] 30583 1726853715.03943: sending task result for task 02083763-bbaf-05ea-abc5-000000001006 30583 1726853715.03998: done sending task result for task 02083763-bbaf-05ea-abc5-000000001006 30583 1726853715.04003: WORKER PROCESS EXITING 30583 1726853715.04059: no more pending results, returning what we have 30583 1726853715.04063: results queue empty 30583 1726853715.04064: checking for any_errors_fatal 30583 1726853715.04074: done checking for any_errors_fatal 30583 1726853715.04075: checking for max_fail_percentage 30583 1726853715.04078: done checking for max_fail_percentage 30583 1726853715.04079: checking to see if all hosts have failed and the running result is not ok 30583 1726853715.04080: done checking to see if all hosts have failed 30583 1726853715.04080: getting the remaining hosts for this loop 30583 1726853715.04082: done getting the remaining hosts for this loop 30583 
1726853715.04086: getting the next task for host managed_node2 30583 1726853715.04093: done getting next task for host managed_node2 30583 1726853715.04096: ^ task is: TASK: Include the task 'show_interfaces.yml' 30583 1726853715.04099: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853715.04102: getting variables 30583 1726853715.04104: in VariableManager get_vars() 30583 1726853715.04137: Calling all_inventory to load vars for managed_node2 30583 1726853715.04140: Calling groups_inventory to load vars for managed_node2 30583 1726853715.04143: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853715.04153: Calling all_plugins_play to load vars for managed_node2 30583 1726853715.04158: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853715.04161: Calling groups_plugins_play to load vars for managed_node2 30583 1726853715.07849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853715.11275: done with get_vars() 30583 1726853715.11308: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 13:35:15 -0400 (0:00:00.238) 0:00:50.452 ****** 30583 1726853715.11522: entering _queue_task() for managed_node2/include_tasks 30583 
1726853715.12368: worker is 1 (out of 1 available) 30583 1726853715.12383: exiting _queue_task() for managed_node2/include_tasks 30583 1726853715.12395: done queuing things up, now waiting for results queue to drain 30583 1726853715.12397: waiting for pending results... 30583 1726853715.12991: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 30583 1726853715.13232: in run() - task 02083763-bbaf-05ea-abc5-000000001007 30583 1726853715.13236: variable 'ansible_search_path' from source: unknown 30583 1726853715.13240: variable 'ansible_search_path' from source: unknown 30583 1726853715.13242: calling self._execute() 30583 1726853715.13583: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853715.13588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853715.13591: variable 'omit' from source: magic vars 30583 1726853715.14250: variable 'ansible_distribution_major_version' from source: facts 30583 1726853715.14262: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853715.14268: _execute() done 30583 1726853715.14273: dumping result to json 30583 1726853715.14276: done dumping result, returning 30583 1726853715.14283: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [02083763-bbaf-05ea-abc5-000000001007] 30583 1726853715.14288: sending task result for task 02083763-bbaf-05ea-abc5-000000001007 30583 1726853715.14614: no more pending results, returning what we have 30583 1726853715.14621: in VariableManager get_vars() 30583 1726853715.14666: Calling all_inventory to load vars for managed_node2 30583 1726853715.14669: Calling groups_inventory to load vars for managed_node2 30583 1726853715.14675: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853715.14690: Calling all_plugins_play to load vars for managed_node2 30583 1726853715.14694: Calling groups_plugins_inventory to load 
vars for managed_node2 30583 1726853715.14698: Calling groups_plugins_play to load vars for managed_node2 30583 1726853715.15216: done sending task result for task 02083763-bbaf-05ea-abc5-000000001007 30583 1726853715.15220: WORKER PROCESS EXITING 30583 1726853715.17652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853715.20844: done with get_vars() 30583 1726853715.20873: variable 'ansible_search_path' from source: unknown 30583 1726853715.20875: variable 'ansible_search_path' from source: unknown 30583 1726853715.21033: we have included files to process 30583 1726853715.21035: generating all_blocks data 30583 1726853715.21037: done generating all_blocks data 30583 1726853715.21043: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30583 1726853715.21044: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30583 1726853715.21047: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30583 1726853715.21311: in VariableManager get_vars() 30583 1726853715.21331: done with get_vars() 30583 1726853715.21603: done processing included file 30583 1726853715.21605: iterating over new_blocks loaded from include file 30583 1726853715.21607: in VariableManager get_vars() 30583 1726853715.21621: done with get_vars() 30583 1726853715.21622: filtering new block on tags 30583 1726853715.21654: done filtering new block on tags 30583 1726853715.21656: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 30583 1726853715.21777: extending task lists for all hosts with included blocks 30583 1726853715.22756: 
done extending task lists 30583 1726853715.22758: done processing included files 30583 1726853715.22759: results queue empty 30583 1726853715.22759: checking for any_errors_fatal 30583 1726853715.22768: done checking for any_errors_fatal 30583 1726853715.22769: checking for max_fail_percentage 30583 1726853715.22770: done checking for max_fail_percentage 30583 1726853715.22772: checking to see if all hosts have failed and the running result is not ok 30583 1726853715.22773: done checking to see if all hosts have failed 30583 1726853715.22774: getting the remaining hosts for this loop 30583 1726853715.22775: done getting the remaining hosts for this loop 30583 1726853715.22779: getting the next task for host managed_node2 30583 1726853715.22783: done getting next task for host managed_node2 30583 1726853715.22785: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30583 1726853715.22788: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853715.22791: getting variables 30583 1726853715.22792: in VariableManager get_vars() 30583 1726853715.22804: Calling all_inventory to load vars for managed_node2 30583 1726853715.22806: Calling groups_inventory to load vars for managed_node2 30583 1726853715.22808: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853715.22814: Calling all_plugins_play to load vars for managed_node2 30583 1726853715.22816: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853715.22819: Calling groups_plugins_play to load vars for managed_node2 30583 1726853715.25512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853715.29009: done with get_vars() 30583 1726853715.29047: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 13:35:15 -0400 (0:00:00.177) 0:00:50.629 ****** 30583 1726853715.29262: entering _queue_task() for managed_node2/include_tasks 30583 1726853715.30311: worker is 1 (out of 1 available) 30583 1726853715.30323: exiting _queue_task() for managed_node2/include_tasks 30583 1726853715.30334: done queuing things up, now waiting for results queue to drain 30583 1726853715.30335: waiting for pending results... 
30583 1726853715.30666: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 30583 1726853715.31008: in run() - task 02083763-bbaf-05ea-abc5-00000000102e 30583 1726853715.31098: variable 'ansible_search_path' from source: unknown 30583 1726853715.31103: variable 'ansible_search_path' from source: unknown 30583 1726853715.31112: calling self._execute() 30583 1726853715.31258: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853715.31262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853715.31269: variable 'omit' from source: magic vars 30583 1726853715.32117: variable 'ansible_distribution_major_version' from source: facts 30583 1726853715.32122: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853715.32277: _execute() done 30583 1726853715.32281: dumping result to json 30583 1726853715.32283: done dumping result, returning 30583 1726853715.32286: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [02083763-bbaf-05ea-abc5-00000000102e] 30583 1726853715.32288: sending task result for task 02083763-bbaf-05ea-abc5-00000000102e 30583 1726853715.32361: done sending task result for task 02083763-bbaf-05ea-abc5-00000000102e 30583 1726853715.32365: WORKER PROCESS EXITING 30583 1726853715.32402: no more pending results, returning what we have 30583 1726853715.32409: in VariableManager get_vars() 30583 1726853715.32456: Calling all_inventory to load vars for managed_node2 30583 1726853715.32459: Calling groups_inventory to load vars for managed_node2 30583 1726853715.32463: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853715.32480: Calling all_plugins_play to load vars for managed_node2 30583 1726853715.32484: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853715.32488: Calling groups_plugins_play to load vars for managed_node2 30583 
1726853715.35743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853715.38989: done with get_vars() 30583 1726853715.39131: variable 'ansible_search_path' from source: unknown 30583 1726853715.39133: variable 'ansible_search_path' from source: unknown 30583 1726853715.39327: we have included files to process 30583 1726853715.39328: generating all_blocks data 30583 1726853715.39330: done generating all_blocks data 30583 1726853715.39331: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30583 1726853715.39333: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30583 1726853715.39336: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30583 1726853715.39856: done processing included file 30583 1726853715.39858: iterating over new_blocks loaded from include file 30583 1726853715.39860: in VariableManager get_vars() 30583 1726853715.39985: done with get_vars() 30583 1726853715.39988: filtering new block on tags 30583 1726853715.40027: done filtering new block on tags 30583 1726853715.40029: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 30583 1726853715.40035: extending task lists for all hosts with included blocks 30583 1726853715.40404: done extending task lists 30583 1726853715.40406: done processing included files 30583 1726853715.40407: results queue empty 30583 1726853715.40407: checking for any_errors_fatal 30583 1726853715.40411: done checking for any_errors_fatal 30583 1726853715.40413: checking for max_fail_percentage 30583 1726853715.40415: done 
checking for max_fail_percentage 30583 1726853715.40415: checking to see if all hosts have failed and the running result is not ok 30583 1726853715.40416: done checking to see if all hosts have failed 30583 1726853715.40417: getting the remaining hosts for this loop 30583 1726853715.40418: done getting the remaining hosts for this loop 30583 1726853715.40421: getting the next task for host managed_node2 30583 1726853715.40426: done getting next task for host managed_node2 30583 1726853715.40428: ^ task is: TASK: Gather current interface info 30583 1726853715.40431: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853715.40434: getting variables 30583 1726853715.40435: in VariableManager get_vars() 30583 1726853715.40445: Calling all_inventory to load vars for managed_node2 30583 1726853715.40447: Calling groups_inventory to load vars for managed_node2 30583 1726853715.40449: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853715.40455: Calling all_plugins_play to load vars for managed_node2 30583 1726853715.40457: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853715.40464: Calling groups_plugins_play to load vars for managed_node2 30583 1726853715.43226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853715.46873: done with get_vars() 30583 1726853715.46912: done getting variables 30583 1726853715.46959: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 13:35:15 -0400 (0:00:00.180) 0:00:50.810 ****** 30583 1726853715.47299: entering _queue_task() for managed_node2/command 30583 1726853715.48222: worker is 1 (out of 1 available) 30583 1726853715.48234: exiting _queue_task() for managed_node2/command 30583 1726853715.48246: done queuing things up, now waiting for results queue to drain 30583 1726853715.48248: waiting for pending results... 
30583 1726853715.48963: running TaskExecutor() for managed_node2/TASK: Gather current interface info 30583 1726853715.49076: in run() - task 02083763-bbaf-05ea-abc5-000000001069 30583 1726853715.49104: variable 'ansible_search_path' from source: unknown 30583 1726853715.49277: variable 'ansible_search_path' from source: unknown 30583 1726853715.49282: calling self._execute() 30583 1726853715.49423: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853715.49466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853715.49484: variable 'omit' from source: magic vars 30583 1726853715.50541: variable 'ansible_distribution_major_version' from source: facts 30583 1726853715.50544: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853715.50547: variable 'omit' from source: magic vars 30583 1726853715.50681: variable 'omit' from source: magic vars 30583 1726853715.51016: variable 'omit' from source: magic vars 30583 1726853715.51028: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853715.51070: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853715.51374: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853715.51878: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853715.51881: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853715.51883: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853715.51885: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853715.51887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 
1726853715.51889: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853715.51891: Set connection var ansible_timeout to 10 30583 1726853715.51893: Set connection var ansible_connection to ssh 30583 1726853715.51895: Set connection var ansible_shell_executable to /bin/sh 30583 1726853715.51897: Set connection var ansible_shell_type to sh 30583 1726853715.51898: Set connection var ansible_pipelining to False 30583 1726853715.52387: variable 'ansible_shell_executable' from source: unknown 30583 1726853715.52390: variable 'ansible_connection' from source: unknown 30583 1726853715.52393: variable 'ansible_module_compression' from source: unknown 30583 1726853715.52396: variable 'ansible_shell_type' from source: unknown 30583 1726853715.52398: variable 'ansible_shell_executable' from source: unknown 30583 1726853715.52400: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853715.52402: variable 'ansible_pipelining' from source: unknown 30583 1726853715.52404: variable 'ansible_timeout' from source: unknown 30583 1726853715.52407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853715.52847: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853715.52864: variable 'omit' from source: magic vars 30583 1726853715.52876: starting attempt loop 30583 1726853715.52883: running the handler 30583 1726853715.52945: _low_level_execute_command(): starting 30583 1726853715.52960: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853715.54996: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853715.55360: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853715.55386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853715.55416: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853715.55581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853715.57364: stdout chunk (state=3): >>>/root <<< 30583 1726853715.57482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853715.58008: stderr chunk (state=3): >>><<< 30583 1726853715.58012: stdout chunk (state=3): >>><<< 30583 1726853715.58016: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853715.58020: _low_level_execute_command(): starting 30583 1726853715.58022: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853715.5791702-32989-98194172233622 `" && echo ansible-tmp-1726853715.5791702-32989-98194172233622="` echo /root/.ansible/tmp/ansible-tmp-1726853715.5791702-32989-98194172233622 `" ) && sleep 0' 30583 1726853715.59156: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853715.59161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853715.59166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853715.59173: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853715.59396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853715.59411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853715.59619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853715.59715: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853715.61758: stdout chunk (state=3): >>>ansible-tmp-1726853715.5791702-32989-98194172233622=/root/.ansible/tmp/ansible-tmp-1726853715.5791702-32989-98194172233622 <<< 30583 1726853715.61916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853715.61920: stdout chunk (state=3): >>><<< 30583 1726853715.61927: stderr chunk (state=3): >>><<< 30583 1726853715.61956: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853715.5791702-32989-98194172233622=/root/.ansible/tmp/ansible-tmp-1726853715.5791702-32989-98194172233622 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853715.61994: variable 'ansible_module_compression' from source: unknown 30583 1726853715.62049: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30583 1726853715.62291: variable 'ansible_facts' from source: unknown 30583 1726853715.62418: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853715.5791702-32989-98194172233622/AnsiballZ_command.py 30583 1726853715.62756: Sending initial data 30583 1726853715.62759: Sent initial data (155 bytes) 30583 1726853715.63336: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853715.63345: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853715.63387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 
1726853715.63404: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853715.63479: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853715.63487: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853715.63591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853715.65314: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30583 1726853715.65338: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 30583 1726853715.65422: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 30583 1726853715.65425: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 30583 1726853715.65451: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853715.65552: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853715.65636: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp_91xdbxo /root/.ansible/tmp/ansible-tmp-1726853715.5791702-32989-98194172233622/AnsiballZ_command.py <<< 30583 1726853715.65640: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853715.5791702-32989-98194172233622/AnsiballZ_command.py" <<< 30583 1726853715.65865: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp_91xdbxo" to remote "/root/.ansible/tmp/ansible-tmp-1726853715.5791702-32989-98194172233622/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853715.5791702-32989-98194172233622/AnsiballZ_command.py" <<< 30583 1726853715.67030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853715.67034: stdout chunk (state=3): >>><<< 30583 1726853715.67041: stderr chunk (state=3): >>><<< 30583 1726853715.67276: done transferring module to remote 30583 1726853715.67280: _low_level_execute_command(): starting 30583 1726853715.67283: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853715.5791702-32989-98194172233622/ /root/.ansible/tmp/ansible-tmp-1726853715.5791702-32989-98194172233622/AnsiballZ_command.py && sleep 0' 30583 1726853715.67610: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853715.67615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853715.67646: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 
1726853715.67649: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853715.67651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853715.67653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853715.67719: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853715.67735: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853715.67796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853715.69780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853715.69784: stdout chunk (state=3): >>><<< 30583 1726853715.69786: stderr chunk (state=3): >>><<< 30583 1726853715.69808: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853715.69901: _low_level_execute_command(): starting 30583 1726853715.69905: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853715.5791702-32989-98194172233622/AnsiballZ_command.py && sleep 0' 30583 1726853715.70502: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853715.70636: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853715.70673: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
30583 1726853715.70807: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853715.86726: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:35:15.862474", "end": "2024-09-20 13:35:15.866019", "delta": "0:00:00.003545", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30583 1726853715.88879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853715.88883: stdout chunk (state=3): >>><<< 30583 1726853715.88886: stderr chunk (state=3): >>><<< 30583 1726853715.88889: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:35:15.862474", "end": "2024-09-20 13:35:15.866019", "delta": "0:00:00.003545", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853715.88892: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853715.5791702-32989-98194172233622/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853715.88895: _low_level_execute_command(): starting 30583 1726853715.88897: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853715.5791702-32989-98194172233622/ > /dev/null 2>&1 && sleep 0' 30583 1726853715.89957: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853715.90044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853715.90401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853715.92335: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853715.92430: stderr chunk (state=3): >>><<< 30583 1726853715.92434: stdout chunk (state=3): >>><<< 30583 1726853715.92438: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853715.92455: handler run complete 30583 1726853715.92459: Evaluated conditional (False): False 30583 1726853715.92462: attempt loop complete, returning result 30583 1726853715.92464: _execute() done 30583 1726853715.92466: dumping result to json 30583 1726853715.92468: done dumping result, returning 30583 1726853715.92538: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [02083763-bbaf-05ea-abc5-000000001069] 30583 1726853715.92544: sending task result for task 02083763-bbaf-05ea-abc5-000000001069 ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003545", "end": "2024-09-20 13:35:15.866019", "rc": 0, "start": "2024-09-20 13:35:15.862474" } STDOUT: bonding_masters eth0 lo 30583 1726853715.92722: no more pending results, returning what we have 30583 1726853715.92727: results queue empty 30583 1726853715.92728: checking for any_errors_fatal 30583 1726853715.92730: done checking for any_errors_fatal 30583 1726853715.92730: checking for max_fail_percentage 30583 1726853715.92733: done checking for max_fail_percentage 30583 1726853715.92734: checking to see if all hosts have failed and the running result is not ok 30583 1726853715.92735: done checking to see if all hosts have failed 30583 1726853715.92735: getting the remaining hosts for this loop 30583 1726853715.92738: done getting the remaining hosts for this loop 30583 1726853715.92742: getting the next task for host managed_node2 30583 
1726853715.92756: done getting next task for host managed_node2 30583 1726853715.92759: ^ task is: TASK: Set current_interfaces 30583 1726853715.92765: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853715.92983: getting variables 30583 1726853715.92986: in VariableManager get_vars() 30583 1726853715.93030: Calling all_inventory to load vars for managed_node2 30583 1726853715.93035: Calling groups_inventory to load vars for managed_node2 30583 1726853715.93039: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853715.93049: done sending task result for task 02083763-bbaf-05ea-abc5-000000001069 30583 1726853715.93052: WORKER PROCESS EXITING 30583 1726853715.93062: Calling all_plugins_play to load vars for managed_node2 30583 1726853715.93066: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853715.93069: Calling groups_plugins_play to load vars for managed_node2 30583 1726853715.94349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853715.95755: done with get_vars() 30583 1726853715.95782: done getting variables 30583 1726853715.95860: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:35:15 -0400 (0:00:00.485) 0:00:51.296 ****** 30583 1726853715.95897: entering _queue_task() for managed_node2/set_fact 30583 1726853715.96408: worker is 1 (out of 1 available) 30583 1726853715.96425: exiting _queue_task() for managed_node2/set_fact 30583 1726853715.96438: done queuing things up, now waiting for results queue to drain 30583 1726853715.96440: waiting for pending results... 
30583 1726853715.96702: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 30583 1726853715.96851: in run() - task 02083763-bbaf-05ea-abc5-00000000106a 30583 1726853715.96890: variable 'ansible_search_path' from source: unknown 30583 1726853715.97075: variable 'ansible_search_path' from source: unknown 30583 1726853715.97079: calling self._execute() 30583 1726853715.97344: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853715.97347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853715.97351: variable 'omit' from source: magic vars 30583 1726853715.97693: variable 'ansible_distribution_major_version' from source: facts 30583 1726853715.97711: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853715.97724: variable 'omit' from source: magic vars 30583 1726853715.97904: variable 'omit' from source: magic vars 30583 1726853715.98056: variable '_current_interfaces' from source: set_fact 30583 1726853715.98325: variable 'omit' from source: magic vars 30583 1726853715.98435: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853715.98528: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853715.98577: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853715.98606: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853715.98679: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853715.98713: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853715.98722: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853715.98732: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853715.99074: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853715.99099: Set connection var ansible_timeout to 10 30583 1726853715.99154: Set connection var ansible_connection to ssh 30583 1726853715.99166: Set connection var ansible_shell_executable to /bin/sh 30583 1726853715.99182: Set connection var ansible_shell_type to sh 30583 1726853715.99198: Set connection var ansible_pipelining to False 30583 1726853715.99226: variable 'ansible_shell_executable' from source: unknown 30583 1726853715.99234: variable 'ansible_connection' from source: unknown 30583 1726853715.99240: variable 'ansible_module_compression' from source: unknown 30583 1726853715.99245: variable 'ansible_shell_type' from source: unknown 30583 1726853715.99250: variable 'ansible_shell_executable' from source: unknown 30583 1726853715.99256: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853715.99262: variable 'ansible_pipelining' from source: unknown 30583 1726853715.99267: variable 'ansible_timeout' from source: unknown 30583 1726853715.99277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853715.99444: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853715.99462: variable 'omit' from source: magic vars 30583 1726853715.99474: starting attempt loop 30583 1726853715.99569: running the handler 30583 1726853715.99574: handler run complete 30583 1726853715.99576: attempt loop complete, returning result 30583 1726853715.99578: _execute() done 30583 1726853715.99581: dumping result to json 30583 1726853715.99583: done dumping result, returning 30583 
1726853715.99586: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [02083763-bbaf-05ea-abc5-00000000106a] 30583 1726853715.99588: sending task result for task 02083763-bbaf-05ea-abc5-00000000106a 30583 1726853715.99661: done sending task result for task 02083763-bbaf-05ea-abc5-00000000106a 30583 1726853715.99665: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 30583 1726853715.99733: no more pending results, returning what we have 30583 1726853715.99736: results queue empty 30583 1726853715.99738: checking for any_errors_fatal 30583 1726853715.99749: done checking for any_errors_fatal 30583 1726853715.99750: checking for max_fail_percentage 30583 1726853715.99753: done checking for max_fail_percentage 30583 1726853715.99754: checking to see if all hosts have failed and the running result is not ok 30583 1726853715.99755: done checking to see if all hosts have failed 30583 1726853715.99755: getting the remaining hosts for this loop 30583 1726853715.99760: done getting the remaining hosts for this loop 30583 1726853715.99766: getting the next task for host managed_node2 30583 1726853715.99777: done getting next task for host managed_node2 30583 1726853715.99780: ^ task is: TASK: Show current_interfaces 30583 1726853715.99784: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853715.99788: getting variables 30583 1726853715.99790: in VariableManager get_vars() 30583 1726853715.99831: Calling all_inventory to load vars for managed_node2 30583 1726853715.99834: Calling groups_inventory to load vars for managed_node2 30583 1726853715.99838: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853715.99848: Calling all_plugins_play to load vars for managed_node2 30583 1726853715.99851: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853715.99854: Calling groups_plugins_play to load vars for managed_node2 30583 1726853716.00932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853716.02164: done with get_vars() 30583 1726853716.02209: done getting variables 30583 1726853716.02288: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 13:35:16 -0400 (0:00:00.064) 0:00:51.360 ****** 30583 1726853716.02324: entering _queue_task() for managed_node2/debug 30583 1726853716.02732: worker is 1 (out of 1 available) 30583 1726853716.02745: exiting _queue_task() for managed_node2/debug 30583 1726853716.02757: done queuing things up, now waiting for results queue to drain 30583 1726853716.02760: waiting for pending results... 
30583 1726853716.03222: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 30583 1726853716.03229: in run() - task 02083763-bbaf-05ea-abc5-00000000102f 30583 1726853716.03234: variable 'ansible_search_path' from source: unknown 30583 1726853716.03237: variable 'ansible_search_path' from source: unknown 30583 1726853716.03242: calling self._execute() 30583 1726853716.03334: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853716.03342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853716.03347: variable 'omit' from source: magic vars 30583 1726853716.03734: variable 'ansible_distribution_major_version' from source: facts 30583 1726853716.03746: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853716.03855: variable 'omit' from source: magic vars 30583 1726853716.03861: variable 'omit' from source: magic vars 30583 1726853716.04068: variable 'current_interfaces' from source: set_fact 30583 1726853716.04074: variable 'omit' from source: magic vars 30583 1726853716.04077: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853716.04079: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853716.04081: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853716.04083: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853716.04085: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853716.04088: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853716.04090: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853716.04093: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853716.04219: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853716.04232: Set connection var ansible_timeout to 10 30583 1726853716.04236: Set connection var ansible_connection to ssh 30583 1726853716.04244: Set connection var ansible_shell_executable to /bin/sh 30583 1726853716.04246: Set connection var ansible_shell_type to sh 30583 1726853716.04249: Set connection var ansible_pipelining to False 30583 1726853716.04291: variable 'ansible_shell_executable' from source: unknown 30583 1726853716.04295: variable 'ansible_connection' from source: unknown 30583 1726853716.04297: variable 'ansible_module_compression' from source: unknown 30583 1726853716.04300: variable 'ansible_shell_type' from source: unknown 30583 1726853716.04302: variable 'ansible_shell_executable' from source: unknown 30583 1726853716.04304: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853716.04306: variable 'ansible_pipelining' from source: unknown 30583 1726853716.04308: variable 'ansible_timeout' from source: unknown 30583 1726853716.04315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853716.04897: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853716.04901: variable 'omit' from source: magic vars 30583 1726853716.04904: starting attempt loop 30583 1726853716.04907: running the handler 30583 1726853716.04996: handler run complete 30583 1726853716.05009: attempt loop complete, returning result 30583 1726853716.05012: _execute() done 30583 1726853716.05015: dumping result to json 30583 1726853716.05022: done dumping result, returning 30583 1726853716.05032: done 
running TaskExecutor() for managed_node2/TASK: Show current_interfaces [02083763-bbaf-05ea-abc5-00000000102f] 30583 1726853716.05036: sending task result for task 02083763-bbaf-05ea-abc5-00000000102f 30583 1726853716.05291: done sending task result for task 02083763-bbaf-05ea-abc5-00000000102f 30583 1726853716.05294: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 30583 1726853716.05432: no more pending results, returning what we have 30583 1726853716.05438: results queue empty 30583 1726853716.05439: checking for any_errors_fatal 30583 1726853716.05446: done checking for any_errors_fatal 30583 1726853716.05447: checking for max_fail_percentage 30583 1726853716.05449: done checking for max_fail_percentage 30583 1726853716.05450: checking to see if all hosts have failed and the running result is not ok 30583 1726853716.05451: done checking to see if all hosts have failed 30583 1726853716.05451: getting the remaining hosts for this loop 30583 1726853716.05453: done getting the remaining hosts for this loop 30583 1726853716.05458: getting the next task for host managed_node2 30583 1726853716.05469: done getting next task for host managed_node2 30583 1726853716.05476: ^ task is: TASK: Setup 30583 1726853716.05482: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853716.05487: getting variables 30583 1726853716.05489: in VariableManager get_vars() 30583 1726853716.05528: Calling all_inventory to load vars for managed_node2 30583 1726853716.05533: Calling groups_inventory to load vars for managed_node2 30583 1726853716.05541: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853716.05554: Calling all_plugins_play to load vars for managed_node2 30583 1726853716.05558: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853716.05561: Calling groups_plugins_play to load vars for managed_node2 30583 1726853716.13964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853716.15173: done with get_vars() 30583 1726853716.15204: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 13:35:16 -0400 (0:00:00.129) 0:00:51.490 ****** 30583 1726853716.15293: entering _queue_task() for managed_node2/include_tasks 30583 1726853716.15630: worker is 1 (out of 1 available) 30583 1726853716.15644: exiting _queue_task() for managed_node2/include_tasks 30583 1726853716.15660: done queuing things up, now waiting for results queue to drain 30583 1726853716.15662: waiting for pending results... 
30583 1726853716.15863: running TaskExecutor() for managed_node2/TASK: Setup 30583 1726853716.15944: in run() - task 02083763-bbaf-05ea-abc5-000000001008 30583 1726853716.15954: variable 'ansible_search_path' from source: unknown 30583 1726853716.15967: variable 'ansible_search_path' from source: unknown 30583 1726853716.16004: variable 'lsr_setup' from source: include params 30583 1726853716.16183: variable 'lsr_setup' from source: include params 30583 1726853716.16239: variable 'omit' from source: magic vars 30583 1726853716.16343: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853716.16352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853716.16362: variable 'omit' from source: magic vars 30583 1726853716.16538: variable 'ansible_distribution_major_version' from source: facts 30583 1726853716.16546: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853716.16554: variable 'item' from source: unknown 30583 1726853716.16601: variable 'item' from source: unknown 30583 1726853716.16628: variable 'item' from source: unknown 30583 1726853716.16675: variable 'item' from source: unknown 30583 1726853716.16799: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853716.16803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853716.16807: variable 'omit' from source: magic vars 30583 1726853716.16885: variable 'ansible_distribution_major_version' from source: facts 30583 1726853716.16889: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853716.16895: variable 'item' from source: unknown 30583 1726853716.16941: variable 'item' from source: unknown 30583 1726853716.16965: variable 'item' from source: unknown 30583 1726853716.17008: variable 'item' from source: unknown 30583 1726853716.17082: dumping result to json 30583 1726853716.17085: done dumping result, returning 30583 
1726853716.17087: done running TaskExecutor() for managed_node2/TASK: Setup [02083763-bbaf-05ea-abc5-000000001008] 30583 1726853716.17089: sending task result for task 02083763-bbaf-05ea-abc5-000000001008 30583 1726853716.17128: done sending task result for task 02083763-bbaf-05ea-abc5-000000001008 30583 1726853716.17130: WORKER PROCESS EXITING 30583 1726853716.17158: no more pending results, returning what we have 30583 1726853716.17164: in VariableManager get_vars() 30583 1726853716.17202: Calling all_inventory to load vars for managed_node2 30583 1726853716.17205: Calling groups_inventory to load vars for managed_node2 30583 1726853716.17208: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853716.17221: Calling all_plugins_play to load vars for managed_node2 30583 1726853716.17224: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853716.17227: Calling groups_plugins_play to load vars for managed_node2 30583 1726853716.18589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853716.19624: done with get_vars() 30583 1726853716.19640: variable 'ansible_search_path' from source: unknown 30583 1726853716.19641: variable 'ansible_search_path' from source: unknown 30583 1726853716.19677: variable 'ansible_search_path' from source: unknown 30583 1726853716.19678: variable 'ansible_search_path' from source: unknown 30583 1726853716.19696: we have included files to process 30583 1726853716.19697: generating all_blocks data 30583 1726853716.19698: done generating all_blocks data 30583 1726853716.19701: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30583 1726853716.19702: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30583 1726853716.19704: Loading data from 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30583 1726853716.19872: done processing included file 30583 1726853716.19874: iterating over new_blocks loaded from include file 30583 1726853716.19876: in VariableManager get_vars() 30583 1726853716.19888: done with get_vars() 30583 1726853716.19890: filtering new block on tags 30583 1726853716.19913: done filtering new block on tags 30583 1726853716.19914: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node2 => (item=tasks/create_bridge_profile.yml) 30583 1726853716.19918: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30583 1726853716.19918: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30583 1726853716.19920: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30583 1726853716.19980: done processing included file 30583 1726853716.19982: iterating over new_blocks loaded from include file 30583 1726853716.19983: in VariableManager get_vars() 30583 1726853716.19994: done with get_vars() 30583 1726853716.19996: filtering new block on tags 30583 1726853716.20009: done filtering new block on tags 30583 1726853716.20010: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed_node2 => (item=tasks/activate_profile.yml) 30583 1726853716.20013: extending task lists for all hosts with included blocks 30583 1726853716.20368: done extending task lists 30583 1726853716.20369: done processing 
included files 30583 1726853716.20370: results queue empty 30583 1726853716.20370: checking for any_errors_fatal 30583 1726853716.20374: done checking for any_errors_fatal 30583 1726853716.20375: checking for max_fail_percentage 30583 1726853716.20376: done checking for max_fail_percentage 30583 1726853716.20376: checking to see if all hosts have failed and the running result is not ok 30583 1726853716.20377: done checking to see if all hosts have failed 30583 1726853716.20377: getting the remaining hosts for this loop 30583 1726853716.20378: done getting the remaining hosts for this loop 30583 1726853716.20380: getting the next task for host managed_node2 30583 1726853716.20383: done getting next task for host managed_node2 30583 1726853716.20384: ^ task is: TASK: Include network role 30583 1726853716.20386: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853716.20387: getting variables 30583 1726853716.20388: in VariableManager get_vars() 30583 1726853716.20394: Calling all_inventory to load vars for managed_node2 30583 1726853716.20401: Calling groups_inventory to load vars for managed_node2 30583 1726853716.20402: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853716.20406: Calling all_plugins_play to load vars for managed_node2 30583 1726853716.20408: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853716.20410: Calling groups_plugins_play to load vars for managed_node2 30583 1726853716.21484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853716.22832: done with get_vars() 30583 1726853716.22854: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Friday 20 September 2024 13:35:16 -0400 (0:00:00.076) 0:00:51.566 ****** 30583 1726853716.22914: entering _queue_task() for managed_node2/include_role 30583 1726853716.23185: worker is 1 (out of 1 available) 30583 1726853716.23197: exiting _queue_task() for managed_node2/include_role 30583 1726853716.23213: done queuing things up, now waiting for results queue to drain 30583 1726853716.23214: waiting for pending results... 
30583 1726853716.23415: running TaskExecutor() for managed_node2/TASK: Include network role 30583 1726853716.23505: in run() - task 02083763-bbaf-05ea-abc5-00000000108f 30583 1726853716.23517: variable 'ansible_search_path' from source: unknown 30583 1726853716.23521: variable 'ansible_search_path' from source: unknown 30583 1726853716.23550: calling self._execute() 30583 1726853716.23628: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853716.23632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853716.23642: variable 'omit' from source: magic vars 30583 1726853716.23935: variable 'ansible_distribution_major_version' from source: facts 30583 1726853716.23949: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853716.23953: _execute() done 30583 1726853716.23956: dumping result to json 30583 1726853716.23959: done dumping result, returning 30583 1726853716.23965: done running TaskExecutor() for managed_node2/TASK: Include network role [02083763-bbaf-05ea-abc5-00000000108f] 30583 1726853716.23969: sending task result for task 02083763-bbaf-05ea-abc5-00000000108f 30583 1726853716.24088: done sending task result for task 02083763-bbaf-05ea-abc5-00000000108f 30583 1726853716.24091: WORKER PROCESS EXITING 30583 1726853716.24118: no more pending results, returning what we have 30583 1726853716.24122: in VariableManager get_vars() 30583 1726853716.24161: Calling all_inventory to load vars for managed_node2 30583 1726853716.24165: Calling groups_inventory to load vars for managed_node2 30583 1726853716.24168: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853716.24181: Calling all_plugins_play to load vars for managed_node2 30583 1726853716.24184: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853716.24187: Calling groups_plugins_play to load vars for managed_node2 30583 1726853716.25441: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853716.26476: done with get_vars() 30583 1726853716.26491: variable 'ansible_search_path' from source: unknown 30583 1726853716.26492: variable 'ansible_search_path' from source: unknown 30583 1726853716.26616: variable 'omit' from source: magic vars 30583 1726853716.26646: variable 'omit' from source: magic vars 30583 1726853716.26655: variable 'omit' from source: magic vars 30583 1726853716.26660: we have included files to process 30583 1726853716.26661: generating all_blocks data 30583 1726853716.26662: done generating all_blocks data 30583 1726853716.26663: processing included file: fedora.linux_system_roles.network 30583 1726853716.26679: in VariableManager get_vars() 30583 1726853716.26690: done with get_vars() 30583 1726853716.26710: in VariableManager get_vars() 30583 1726853716.26720: done with get_vars() 30583 1726853716.26749: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30583 1726853716.26824: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30583 1726853716.26875: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30583 1726853716.27146: in VariableManager get_vars() 30583 1726853716.27161: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30583 1726853716.28995: iterating over new_blocks loaded from include file 30583 1726853716.28997: in VariableManager get_vars() 30583 1726853716.29017: done with get_vars() 30583 1726853716.29019: filtering new block on tags 30583 1726853716.29347: done filtering new block on tags 30583 1726853716.29350: in VariableManager get_vars() 30583 1726853716.29362: done with get_vars() 30583 1726853716.29363: filtering new block on tags 30583 1726853716.29379: done 
filtering new block on tags 30583 1726853716.29383: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30583 1726853716.29389: extending task lists for all hosts with included blocks 30583 1726853716.29531: done extending task lists 30583 1726853716.29532: done processing included files 30583 1726853716.29533: results queue empty 30583 1726853716.29534: checking for any_errors_fatal 30583 1726853716.29537: done checking for any_errors_fatal 30583 1726853716.29538: checking for max_fail_percentage 30583 1726853716.29539: done checking for max_fail_percentage 30583 1726853716.29540: checking to see if all hosts have failed and the running result is not ok 30583 1726853716.29541: done checking to see if all hosts have failed 30583 1726853716.29541: getting the remaining hosts for this loop 30583 1726853716.29543: done getting the remaining hosts for this loop 30583 1726853716.29547: getting the next task for host managed_node2 30583 1726853716.29555: done getting next task for host managed_node2 30583 1726853716.29558: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30583 1726853716.29562: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853716.29574: getting variables 30583 1726853716.29575: in VariableManager get_vars() 30583 1726853716.29593: Calling all_inventory to load vars for managed_node2 30583 1726853716.29596: Calling groups_inventory to load vars for managed_node2 30583 1726853716.29598: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853716.29603: Calling all_plugins_play to load vars for managed_node2 30583 1726853716.29606: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853716.29609: Calling groups_plugins_play to load vars for managed_node2 30583 1726853716.30876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853716.32654: done with get_vars() 30583 1726853716.32677: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:35:16 -0400 (0:00:00.098) 0:00:51.665 ****** 30583 1726853716.32779: entering _queue_task() for managed_node2/include_tasks 30583 1726853716.33125: worker is 1 (out of 1 available) 30583 1726853716.33139: exiting _queue_task() for managed_node2/include_tasks 30583 1726853716.33153: done queuing things up, now waiting for results queue to drain 30583 1726853716.33154: waiting for pending results... 
30583 1726853716.33358: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30583 1726853716.33455: in run() - task 02083763-bbaf-05ea-abc5-0000000010f5 30583 1726853716.33470: variable 'ansible_search_path' from source: unknown 30583 1726853716.33475: variable 'ansible_search_path' from source: unknown 30583 1726853716.33508: calling self._execute() 30583 1726853716.33586: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853716.33589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853716.33600: variable 'omit' from source: magic vars 30583 1726853716.33884: variable 'ansible_distribution_major_version' from source: facts 30583 1726853716.33893: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853716.33898: _execute() done 30583 1726853716.33901: dumping result to json 30583 1726853716.33904: done dumping result, returning 30583 1726853716.33912: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-05ea-abc5-0000000010f5] 30583 1726853716.33914: sending task result for task 02083763-bbaf-05ea-abc5-0000000010f5 30583 1726853716.34003: done sending task result for task 02083763-bbaf-05ea-abc5-0000000010f5 30583 1726853716.34006: WORKER PROCESS EXITING 30583 1726853716.34074: no more pending results, returning what we have 30583 1726853716.34080: in VariableManager get_vars() 30583 1726853716.34125: Calling all_inventory to load vars for managed_node2 30583 1726853716.34128: Calling groups_inventory to load vars for managed_node2 30583 1726853716.34130: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853716.34144: Calling all_plugins_play to load vars for managed_node2 30583 1726853716.34147: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853716.34151: Calling 
groups_plugins_play to load vars for managed_node2 30583 1726853716.35717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853716.37291: done with get_vars() 30583 1726853716.37313: variable 'ansible_search_path' from source: unknown 30583 1726853716.37314: variable 'ansible_search_path' from source: unknown 30583 1726853716.37377: we have included files to process 30583 1726853716.37379: generating all_blocks data 30583 1726853716.37381: done generating all_blocks data 30583 1726853716.37384: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853716.37386: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853716.37388: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853716.38023: done processing included file 30583 1726853716.38025: iterating over new_blocks loaded from include file 30583 1726853716.38027: in VariableManager get_vars() 30583 1726853716.38052: done with get_vars() 30583 1726853716.38055: filtering new block on tags 30583 1726853716.38125: done filtering new block on tags 30583 1726853716.38129: in VariableManager get_vars() 30583 1726853716.38150: done with get_vars() 30583 1726853716.38152: filtering new block on tags 30583 1726853716.38208: done filtering new block on tags 30583 1726853716.38211: in VariableManager get_vars() 30583 1726853716.38233: done with get_vars() 30583 1726853716.38235: filtering new block on tags 30583 1726853716.38287: done filtering new block on tags 30583 1726853716.38290: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30583 1726853716.38297: extending task lists for 
all hosts with included blocks 30583 1726853716.40219: done extending task lists 30583 1726853716.40221: done processing included files 30583 1726853716.40222: results queue empty 30583 1726853716.40223: checking for any_errors_fatal 30583 1726853716.40226: done checking for any_errors_fatal 30583 1726853716.40227: checking for max_fail_percentage 30583 1726853716.40228: done checking for max_fail_percentage 30583 1726853716.40229: checking to see if all hosts have failed and the running result is not ok 30583 1726853716.40230: done checking to see if all hosts have failed 30583 1726853716.40231: getting the remaining hosts for this loop 30583 1726853716.40232: done getting the remaining hosts for this loop 30583 1726853716.40236: getting the next task for host managed_node2 30583 1726853716.40241: done getting next task for host managed_node2 30583 1726853716.40244: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30583 1726853716.40248: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853716.40263: getting variables 30583 1726853716.40264: in VariableManager get_vars() 30583 1726853716.40296: Calling all_inventory to load vars for managed_node2 30583 1726853716.40299: Calling groups_inventory to load vars for managed_node2 30583 1726853716.40301: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853716.40312: Calling all_plugins_play to load vars for managed_node2 30583 1726853716.40315: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853716.40319: Calling groups_plugins_play to load vars for managed_node2 30583 1726853716.41704: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853716.43486: done with get_vars() 30583 1726853716.43518: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:35:16 -0400 (0:00:00.108) 0:00:51.773 ****** 30583 1726853716.43624: entering _queue_task() for managed_node2/setup 30583 1726853716.44065: worker is 1 (out of 1 available) 30583 1726853716.44216: exiting _queue_task() for managed_node2/setup 30583 1726853716.44228: done queuing things up, now waiting for results queue to drain 30583 1726853716.44230: waiting for pending results... 
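[annotation] The segment above shows tasks/main.yml:4 ("Ensure ansible_facts used by role") passing its distribution gate and pulling set_facts.yml into the task list for managed_node2. Reconstructed from the task names, path, and evaluated conditional in the log — this is a hedged sketch of what such an include task typically looks like, not the role's verbatim source:

```yaml
# Hedged reconstruction from the log, not copied from the role source.
# Task name and file path are taken from the log lines above; the exact
# YAML layout in fedora.linux_system_roles.network may differ.
- name: Ensure ansible_facts used by role
  include_tasks: set_facts.yml
  when: ansible_distribution_major_version != '6'   # the conditional the log shows evaluating True
```

Because the `when` evaluated True, the include ran, and the log then shows the three blocks of set_facts.yml being loaded, tag-filtered, and appended to the host's task list.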
30583 1726853716.44585: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30583 1726853716.44609: in run() - task 02083763-bbaf-05ea-abc5-000000001152 30583 1726853716.44630: variable 'ansible_search_path' from source: unknown 30583 1726853716.44641: variable 'ansible_search_path' from source: unknown 30583 1726853716.44694: calling self._execute() 30583 1726853716.44876: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853716.44882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853716.44893: variable 'omit' from source: magic vars 30583 1726853716.45270: variable 'ansible_distribution_major_version' from source: facts 30583 1726853716.45337: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853716.45526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853716.47878: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853716.47965: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853716.48018: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853716.48117: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853716.48121: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853716.48224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853716.48269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853716.48308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853716.48365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853716.48443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853716.48446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853716.48485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853716.48526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853716.48585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853716.48614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853716.48831: variable '__network_required_facts' from source: role 
'' defaults 30583 1726853716.48852: variable 'ansible_facts' from source: unknown 30583 1726853716.49697: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30583 1726853716.49705: when evaluation is False, skipping this task 30583 1726853716.49876: _execute() done 30583 1726853716.49879: dumping result to json 30583 1726853716.49882: done dumping result, returning 30583 1726853716.49884: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-05ea-abc5-000000001152] 30583 1726853716.49887: sending task result for task 02083763-bbaf-05ea-abc5-000000001152 30583 1726853716.49956: done sending task result for task 02083763-bbaf-05ea-abc5-000000001152 30583 1726853716.49959: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853716.50015: no more pending results, returning what we have 30583 1726853716.50019: results queue empty 30583 1726853716.50020: checking for any_errors_fatal 30583 1726853716.50021: done checking for any_errors_fatal 30583 1726853716.50022: checking for max_fail_percentage 30583 1726853716.50024: done checking for max_fail_percentage 30583 1726853716.50025: checking to see if all hosts have failed and the running result is not ok 30583 1726853716.50026: done checking to see if all hosts have failed 30583 1726853716.50027: getting the remaining hosts for this loop 30583 1726853716.50028: done getting the remaining hosts for this loop 30583 1726853716.50033: getting the next task for host managed_node2 30583 1726853716.50046: done getting next task for host managed_node2 30583 1726853716.50051: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30583 1726853716.50058: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853716.50082: getting variables 30583 1726853716.50085: in VariableManager get_vars() 30583 1726853716.50125: Calling all_inventory to load vars for managed_node2 30583 1726853716.50127: Calling groups_inventory to load vars for managed_node2 30583 1726853716.50130: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853716.50140: Calling all_plugins_play to load vars for managed_node2 30583 1726853716.50143: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853716.50151: Calling groups_plugins_play to load vars for managed_node2 30583 1726853716.52592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853716.54806: done with get_vars() 30583 1726853716.54832: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:35:16 -0400 (0:00:00.113) 0:00:51.886 ****** 30583 1726853716.54942: entering _queue_task() for managed_node2/stat 30583 1726853716.55428: worker is 1 (out of 1 available) 30583 1726853716.55441: exiting _queue_task() for managed_node2/stat 30583 1726853716.55452: done queuing things up, now waiting for results queue to drain 30583 1726853716.55453: waiting for pending results... 
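[annotation] The skip above ("Ensure ansible_facts used by role are present", censored by `no_log: true`) hinges on the logged conditional `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`: gather facts only if some required fact subset is missing. A minimal Python sketch of that check — the fact names here are illustrative, not the role's actual `__network_required_facts` list:

```python
# Python equivalent of the Jinja2 condition logged above:
#   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
def facts_missing(required_facts, gathered_facts):
    """Return the required fact names not yet present in ansible_facts."""
    return [f for f in required_facts if f not in gathered_facts]

required = ["distribution", "os_family"]                      # illustrative names
gathered = {"distribution": "CentOS", "os_family": "RedHat"}  # illustrative facts

# A non-empty result would trigger a setup (fact-gathering) run; here nothing
# is missing, so the task is skipped -- matching the "False" evaluation above.
assert len(facts_missing(required, gathered)) == 0
```

When the condition is False the module never runs, which is why the log jumps straight from the conditional evaluation to `when evaluation is False, skipping this task`.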
30583 1726853716.55747: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30583 1726853716.55828: in run() - task 02083763-bbaf-05ea-abc5-000000001154 30583 1726853716.55877: variable 'ansible_search_path' from source: unknown 30583 1726853716.55880: variable 'ansible_search_path' from source: unknown 30583 1726853716.55906: calling self._execute() 30583 1726853716.56061: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853716.56065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853716.56067: variable 'omit' from source: magic vars 30583 1726853716.56849: variable 'ansible_distribution_major_version' from source: facts 30583 1726853716.56868: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853716.57217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853716.57695: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853716.57759: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853716.57801: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853716.57847: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853716.57958: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853716.58002: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853716.58039: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853716.58076: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853716.58180: variable '__network_is_ostree' from source: set_fact 30583 1726853716.58198: Evaluated conditional (not __network_is_ostree is defined): False 30583 1726853716.58206: when evaluation is False, skipping this task 30583 1726853716.58213: _execute() done 30583 1726853716.58221: dumping result to json 30583 1726853716.58228: done dumping result, returning 30583 1726853716.58242: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-05ea-abc5-000000001154] 30583 1726853716.58259: sending task result for task 02083763-bbaf-05ea-abc5-000000001154 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30583 1726853716.58534: no more pending results, returning what we have 30583 1726853716.58538: results queue empty 30583 1726853716.58539: checking for any_errors_fatal 30583 1726853716.58546: done checking for any_errors_fatal 30583 1726853716.58547: checking for max_fail_percentage 30583 1726853716.58549: done checking for max_fail_percentage 30583 1726853716.58550: checking to see if all hosts have failed and the running result is not ok 30583 1726853716.58551: done checking to see if all hosts have failed 30583 1726853716.58551: getting the remaining hosts for this loop 30583 1726853716.58554: done getting the remaining hosts for this loop 30583 1726853716.58558: getting the next task for host managed_node2 30583 1726853716.58569: done getting next task for host managed_node2 30583 
1726853716.58575: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30583 1726853716.58589: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853716.58601: done sending task result for task 02083763-bbaf-05ea-abc5-000000001154 30583 1726853716.58604: WORKER PROCESS EXITING 30583 1726853716.58621: getting variables 30583 1726853716.58622: in VariableManager get_vars() 30583 1726853716.58665: Calling all_inventory to load vars for managed_node2 30583 1726853716.58668: Calling groups_inventory to load vars for managed_node2 30583 1726853716.58802: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853716.58812: Calling all_plugins_play to load vars for managed_node2 30583 1726853716.58815: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853716.58817: Calling groups_plugins_play to load vars for managed_node2 30583 1726853716.61407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853716.63706: done with get_vars() 30583 1726853716.63729: done getting variables 30583 1726853716.63775: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:35:16 -0400 (0:00:00.088) 0:00:51.975 ****** 30583 1726853716.63824: entering _queue_task() for managed_node2/set_fact 30583 1726853716.64376: worker is 1 (out of 1 available) 30583 1726853716.64386: exiting _queue_task() for managed_node2/set_fact 30583 1726853716.64397: done queuing things up, now waiting for results queue to drain 30583 1726853716.64399: waiting for pending results... 
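[annotation] Both ostree tasks above ("Check if system is ostree" and "Set flag to indicate system is ostree") skip on `not __network_is_ostree is defined` evaluating False: the fact was already set by `set_fact` in an earlier pass of the role, so the check runs at most once per host per play. A small sketch of that run-once guard pattern (assumed semantics, modeled on the logged conditional):

```python
# Sketch of the "is defined" guard seen in the two skipped tasks: once
# __network_is_ostree exists in the host's facts, the stat + set_fact pair
# is skipped on every later role invocation in the same play.
facts = {"__network_is_ostree": False}  # already set by an earlier role run

def should_run_ostree_check(host_facts):
    # Jinja2: not __network_is_ostree is defined
    return "__network_is_ostree" not in host_facts

assert should_run_ostree_check(facts) is False  # skipped, as in the log
assert should_run_ostree_check({}) is True      # a first run would execute it
```

This explains why the log shows `false_condition: "not __network_is_ostree is defined"` with `skip_reason: "Conditional result was False"` for both tasks back to back.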
30583 1726853716.65096: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30583 1726853716.65175: in run() - task 02083763-bbaf-05ea-abc5-000000001155 30583 1726853716.65233: variable 'ansible_search_path' from source: unknown 30583 1726853716.65382: variable 'ansible_search_path' from source: unknown 30583 1726853716.65386: calling self._execute() 30583 1726853716.65523: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853716.65588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853716.65606: variable 'omit' from source: magic vars 30583 1726853716.66266: variable 'ansible_distribution_major_version' from source: facts 30583 1726853716.66311: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853716.66528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853716.66817: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853716.66876: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853716.66916: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853716.66963: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853716.67062: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853716.67101: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853716.67138: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853716.67179: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853716.67285: variable '__network_is_ostree' from source: set_fact 30583 1726853716.67291: Evaluated conditional (not __network_is_ostree is defined): False 30583 1726853716.67300: when evaluation is False, skipping this task 30583 1726853716.67302: _execute() done 30583 1726853716.67304: dumping result to json 30583 1726853716.67307: done dumping result, returning 30583 1726853716.67310: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-05ea-abc5-000000001155] 30583 1726853716.67313: sending task result for task 02083763-bbaf-05ea-abc5-000000001155 30583 1726853716.67414: done sending task result for task 02083763-bbaf-05ea-abc5-000000001155 30583 1726853716.67418: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30583 1726853716.67466: no more pending results, returning what we have 30583 1726853716.67470: results queue empty 30583 1726853716.67475: checking for any_errors_fatal 30583 1726853716.67482: done checking for any_errors_fatal 30583 1726853716.67483: checking for max_fail_percentage 30583 1726853716.67485: done checking for max_fail_percentage 30583 1726853716.67486: checking to see if all hosts have failed and the running result is not ok 30583 1726853716.67486: done checking to see if all hosts have failed 30583 1726853716.67487: getting the remaining hosts for this loop 30583 1726853716.67489: done getting the remaining hosts for this loop 
30583 1726853716.67492: getting the next task for host managed_node2 30583 1726853716.67504: done getting next task for host managed_node2 30583 1726853716.67508: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853716.67514: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853716.67536: getting variables 30583 1726853716.67537: in VariableManager get_vars() 30583 1726853716.67577: Calling all_inventory to load vars for managed_node2 30583 1726853716.67583: Calling groups_inventory to load vars for managed_node2 30583 1726853716.67586: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853716.67596: Calling all_plugins_play to load vars for managed_node2 30583 1726853716.67599: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853716.67601: Calling groups_plugins_play to load vars for managed_node2 30583 1726853716.68422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853716.70119: done with get_vars() 30583 1726853716.70144: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:35:16 -0400 (0:00:00.064) 0:00:52.039 ****** 30583 1726853716.70258: entering _queue_task() for managed_node2/service_facts 30583 1726853716.70628: worker is 1 (out of 1 available) 30583 1726853716.70642: exiting _queue_task() for managed_node2/service_facts 30583 1726853716.70655: done queuing things up, now waiting for results queue to drain 30583 1726853716.70656: waiting for pending results... 
30583 1726853716.70983: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853716.71061: in run() - task 02083763-bbaf-05ea-abc5-000000001157 30583 1726853716.71077: variable 'ansible_search_path' from source: unknown 30583 1726853716.71081: variable 'ansible_search_path' from source: unknown 30583 1726853716.71114: calling self._execute() 30583 1726853716.71220: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853716.71296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853716.71301: variable 'omit' from source: magic vars 30583 1726853716.71634: variable 'ansible_distribution_major_version' from source: facts 30583 1726853716.71645: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853716.71651: variable 'omit' from source: magic vars 30583 1726853716.71735: variable 'omit' from source: magic vars 30583 1726853716.71768: variable 'omit' from source: magic vars 30583 1726853716.71827: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853716.71868: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853716.71898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853716.71918: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853716.71938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853716.71969: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853716.71974: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853716.71977: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853716.72123: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853716.72142: Set connection var ansible_timeout to 10 30583 1726853716.72146: Set connection var ansible_connection to ssh 30583 1726853716.72148: Set connection var ansible_shell_executable to /bin/sh 30583 1726853716.72150: Set connection var ansible_shell_type to sh 30583 1726853716.72177: Set connection var ansible_pipelining to False 30583 1726853716.72185: variable 'ansible_shell_executable' from source: unknown 30583 1726853716.72394: variable 'ansible_connection' from source: unknown 30583 1726853716.72398: variable 'ansible_module_compression' from source: unknown 30583 1726853716.72400: variable 'ansible_shell_type' from source: unknown 30583 1726853716.72402: variable 'ansible_shell_executable' from source: unknown 30583 1726853716.72404: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853716.72406: variable 'ansible_pipelining' from source: unknown 30583 1726853716.72408: variable 'ansible_timeout' from source: unknown 30583 1726853716.72410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853716.72498: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853716.72502: variable 'omit' from source: magic vars 30583 1726853716.72505: starting attempt loop 30583 1726853716.72507: running the handler 30583 1726853716.72509: _low_level_execute_command(): starting 30583 1726853716.72511: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853716.73478: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853716.73482: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30583 1726853716.73484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853716.73486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853716.73488: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853716.73491: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853716.73493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853716.73495: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853716.73539: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853716.75275: stdout chunk (state=3): >>>/root <<< 30583 1726853716.75473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853716.75677: stderr chunk (state=3): >>><<< 30583 1726853716.75680: stdout chunk (state=3): >>><<< 30583 1726853716.75685: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853716.75688: _low_level_execute_command(): starting 30583 1726853716.75691: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853716.7551663-33059-106403397617271 `" && echo ansible-tmp-1726853716.7551663-33059-106403397617271="` echo /root/.ansible/tmp/ansible-tmp-1726853716.7551663-33059-106403397617271 `" ) && sleep 0' 30583 1726853716.76162: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853716.76265: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853716.76477: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853716.76555: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853716.78603: stdout chunk (state=3): >>>ansible-tmp-1726853716.7551663-33059-106403397617271=/root/.ansible/tmp/ansible-tmp-1726853716.7551663-33059-106403397617271 <<< 30583 1726853716.78775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853716.78779: stdout chunk (state=3): >>><<< 30583 1726853716.78781: stderr chunk (state=3): >>><<< 30583 1726853716.78978: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853716.7551663-33059-106403397617271=/root/.ansible/tmp/ansible-tmp-1726853716.7551663-33059-106403397617271 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853716.78982: variable 'ansible_module_compression' from source: unknown 30583 1726853716.78984: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30583 1726853716.78987: variable 'ansible_facts' from source: unknown 30583 1726853716.79054: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853716.7551663-33059-106403397617271/AnsiballZ_service_facts.py 30583 1726853716.79236: Sending initial data 30583 1726853716.79246: Sent initial data (162 bytes) 30583 1726853716.79921: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853716.80002: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853716.80148: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853716.80191: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853716.80297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853716.80516: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853716.82241: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853716.82380: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853716.82500: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpdrgm_306 /root/.ansible/tmp/ansible-tmp-1726853716.7551663-33059-106403397617271/AnsiballZ_service_facts.py <<< 30583 1726853716.82504: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853716.7551663-33059-106403397617271/AnsiballZ_service_facts.py" <<< 30583 1726853716.82889: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpdrgm_306" to remote "/root/.ansible/tmp/ansible-tmp-1726853716.7551663-33059-106403397617271/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853716.7551663-33059-106403397617271/AnsiballZ_service_facts.py" <<< 30583 1726853716.83567: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853716.83588: stderr chunk (state=3): >>><<< 30583 1726853716.83605: stdout chunk (state=3): >>><<< 30583 1726853716.83701: done transferring module to remote 30583 1726853716.83735: _low_level_execute_command(): starting 30583 1726853716.83748: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853716.7551663-33059-106403397617271/ /root/.ansible/tmp/ansible-tmp-1726853716.7551663-33059-106403397617271/AnsiballZ_service_facts.py && sleep 0' 30583 1726853716.84327: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853716.84338: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853716.84395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853716.84400: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853716.84463: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853716.86378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853716.86487: stdout chunk (state=3): >>><<< 30583 1726853716.86491: stderr chunk (state=3): >>><<< 30583 1726853716.86494: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853716.86500: _low_level_execute_command(): starting 30583 1726853716.86503: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853716.7551663-33059-106403397617271/AnsiballZ_service_facts.py && sleep 0' 30583 1726853716.87097: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853716.87101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853716.87103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853716.87143: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853716.87155: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853716.87232: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 30583 1726853718.53431: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": 
{"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": 
"systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, 
"systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", 
"status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30583 1726853718.55088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853718.55100: stdout chunk (state=3): >>><<< 30583 1726853718.55113: stderr chunk (state=3): >>><<< 30583 1726853718.55142: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": 
"dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, 
"sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": 
"systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": 
"systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853718.56354: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853716.7551663-33059-106403397617271/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853718.56359: _low_level_execute_command(): starting 30583 1726853718.56362: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853716.7551663-33059-106403397617271/ > /dev/null 2>&1 && sleep 0' 30583 1726853718.57143: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853718.57221: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853718.57303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853718.59249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853718.59252: stdout chunk (state=3): >>><<< 30583 1726853718.59260: stderr chunk (state=3): >>><<< 30583 1726853718.59288: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853718.59293: handler run complete 30583 1726853718.59540: variable 'ansible_facts' from source: unknown 30583 1726853718.59860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 
1726853718.60326: variable 'ansible_facts' from source: unknown 30583 1726853718.60482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853718.60721: attempt loop complete, returning result 30583 1726853718.60725: _execute() done 30583 1726853718.60728: dumping result to json 30583 1726853718.60811: done dumping result, returning 30583 1726853718.60814: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-05ea-abc5-000000001157] 30583 1726853718.60816: sending task result for task 02083763-bbaf-05ea-abc5-000000001157 30583 1726853718.62155: done sending task result for task 02083763-bbaf-05ea-abc5-000000001157 30583 1726853718.62159: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853718.62319: no more pending results, returning what we have 30583 1726853718.62322: results queue empty 30583 1726853718.62323: checking for any_errors_fatal 30583 1726853718.62326: done checking for any_errors_fatal 30583 1726853718.62327: checking for max_fail_percentage 30583 1726853718.62329: done checking for max_fail_percentage 30583 1726853718.62329: checking to see if all hosts have failed and the running result is not ok 30583 1726853718.62330: done checking to see if all hosts have failed 30583 1726853718.62331: getting the remaining hosts for this loop 30583 1726853718.62337: done getting the remaining hosts for this loop 30583 1726853718.62341: getting the next task for host managed_node2 30583 1726853718.62349: done getting next task for host managed_node2 30583 1726853718.62353: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 1726853718.62361: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853718.62381: getting variables 30583 1726853718.62383: in VariableManager get_vars() 30583 1726853718.62416: Calling all_inventory to load vars for managed_node2 30583 1726853718.62419: Calling groups_inventory to load vars for managed_node2 30583 1726853718.62421: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853718.62430: Calling all_plugins_play to load vars for managed_node2 30583 1726853718.62433: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853718.62436: Calling groups_plugins_play to load vars for managed_node2 30583 1726853718.64105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853718.65750: done with get_vars() 30583 1726853718.65769: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:35:18 -0400 (0:00:01.955) 0:00:53.995 ****** 30583 1726853718.65845: entering _queue_task() for managed_node2/package_facts 30583 1726853718.66103: worker is 1 (out of 1 available) 30583 1726853718.66118: exiting _queue_task() for managed_node2/package_facts 30583 1726853718.66131: done queuing things up, now waiting for results queue to drain 30583 1726853718.66133: waiting for pending results... 
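Before pushing the module, `_low_level_execute_command()` creates a per-task remote temp directory with a `/bin/sh -c` one-liner of the form that appears verbatim later in this log: `( umask 77 && mkdir -p ... && echo ansible-tmp-...=... )`. A standalone sketch of that idiom, with `tmpbase` and the directory name as illustrative stand-ins for `/root/.ansible/tmp/ansible-tmp-<timestamp>-<pid>-<rand>`:

```shell
# Sketch of Ansible's remote tmpdir idiom (paths are illustrative):
# umask 77 yields mode 0700, mkdir -p creates missing parents, and the
# final echo reports the resolved path back to the controller, which
# parses it out of stdout.
tmpbase="$(mktemp -d)"   # stand-in for /root/.ansible/tmp
( umask 77 \
  && mkdir -p "$tmpbase/ansible-tmp-demo" \
  && echo ansible_tmp="$tmpbase/ansible-tmp-demo" )
```

The subshell keeps the `umask` change from leaking into the caller's environment, which is why the real command is wrapped in `( ... ) && sleep 0`.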
30583 1726853718.66325: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 1726853718.66431: in run() - task 02083763-bbaf-05ea-abc5-000000001158 30583 1726853718.66443: variable 'ansible_search_path' from source: unknown 30583 1726853718.66446: variable 'ansible_search_path' from source: unknown 30583 1726853718.66480: calling self._execute() 30583 1726853718.66557: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853718.66561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853718.66573: variable 'omit' from source: magic vars 30583 1726853718.66850: variable 'ansible_distribution_major_version' from source: facts 30583 1726853718.66859: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853718.66867: variable 'omit' from source: magic vars 30583 1726853718.66954: variable 'omit' from source: magic vars 30583 1726853718.66969: variable 'omit' from source: magic vars 30583 1726853718.67005: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853718.67053: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853718.67121: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853718.67124: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853718.67127: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853718.67129: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853718.67132: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853718.67134: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853718.67308: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853718.67311: Set connection var ansible_timeout to 10 30583 1726853718.67314: Set connection var ansible_connection to ssh 30583 1726853718.67316: Set connection var ansible_shell_executable to /bin/sh 30583 1726853718.67319: Set connection var ansible_shell_type to sh 30583 1726853718.67320: Set connection var ansible_pipelining to False 30583 1726853718.67322: variable 'ansible_shell_executable' from source: unknown 30583 1726853718.67324: variable 'ansible_connection' from source: unknown 30583 1726853718.67327: variable 'ansible_module_compression' from source: unknown 30583 1726853718.67329: variable 'ansible_shell_type' from source: unknown 30583 1726853718.67332: variable 'ansible_shell_executable' from source: unknown 30583 1726853718.67333: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853718.67335: variable 'ansible_pipelining' from source: unknown 30583 1726853718.67337: variable 'ansible_timeout' from source: unknown 30583 1726853718.67339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853718.67598: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853718.67602: variable 'omit' from source: magic vars 30583 1726853718.67605: starting attempt loop 30583 1726853718.67607: running the handler 30583 1726853718.67609: _low_level_execute_command(): starting 30583 1726853718.67611: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853718.68193: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853718.68209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30583 1726853718.68218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853718.68230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853718.68243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853718.68248: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853718.68260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853718.68276: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853718.68285: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853718.68291: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853718.68301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853718.68312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853718.68323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853718.68331: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853718.68355: stderr chunk (state=3): >>>debug2: match found <<< 30583 1726853718.68358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853718.68423: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853718.68445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853718.68520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853718.70244: stdout 
chunk (state=3): >>>/root <<< 30583 1726853718.70345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853718.70376: stderr chunk (state=3): >>><<< 30583 1726853718.70380: stdout chunk (state=3): >>><<< 30583 1726853718.70401: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853718.70421: _low_level_execute_command(): starting 30583 1726853718.70426: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853718.704002-33153-206309799633859 `" && echo ansible-tmp-1726853718.704002-33153-206309799633859="` echo /root/.ansible/tmp/ansible-tmp-1726853718.704002-33153-206309799633859 `" ) && sleep 0' 30583 1726853718.71031: stderr chunk (state=2): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853718.71118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853718.71146: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853718.71250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853718.73272: stdout chunk (state=3): >>>ansible-tmp-1726853718.704002-33153-206309799633859=/root/.ansible/tmp/ansible-tmp-1726853718.704002-33153-206309799633859 <<< 30583 1726853718.73394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853718.73434: stderr chunk (state=3): >>><<< 30583 1726853718.73437: stdout chunk (state=3): >>><<< 30583 1726853718.73467: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853718.704002-33153-206309799633859=/root/.ansible/tmp/ansible-tmp-1726853718.704002-33153-206309799633859 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853718.73524: variable 'ansible_module_compression' from source: unknown 30583 1726853718.73585: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30583 1726853718.73638: variable 'ansible_facts' from source: unknown 30583 1726853718.73825: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853718.704002-33153-206309799633859/AnsiballZ_package_facts.py 30583 1726853718.74052: Sending initial data 30583 1726853718.74055: Sent initial data (161 bytes) 30583 1726853718.74512: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853718.74521: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853718.74533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853718.74544: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853718.74677: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853718.74681: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853718.74748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853718.76417: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30583 1726853718.76422: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 
1726853718.76484: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853718.76555: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpiqh9t6q3 /root/.ansible/tmp/ansible-tmp-1726853718.704002-33153-206309799633859/AnsiballZ_package_facts.py <<< 30583 1726853718.76561: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853718.704002-33153-206309799633859/AnsiballZ_package_facts.py" <<< 30583 1726853718.76621: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpiqh9t6q3" to remote "/root/.ansible/tmp/ansible-tmp-1726853718.704002-33153-206309799633859/AnsiballZ_package_facts.py" <<< 30583 1726853718.76626: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853718.704002-33153-206309799633859/AnsiballZ_package_facts.py" <<< 30583 1726853718.77988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853718.78034: stderr chunk (state=3): >>><<< 30583 1726853718.78037: stdout chunk (state=3): >>><<< 30583 1726853718.78050: done transferring module to remote 30583 1726853718.78061: _low_level_execute_command(): starting 30583 1726853718.78067: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853718.704002-33153-206309799633859/ /root/.ansible/tmp/ansible-tmp-1726853718.704002-33153-206309799633859/AnsiballZ_package_facts.py && sleep 0' 30583 1726853718.78516: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853718.78519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853718.78525: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853718.78527: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853718.78530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853718.78577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853718.78587: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853718.78655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853718.80584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853718.80611: stderr chunk (state=3): >>><<< 30583 1726853718.80614: stdout chunk (state=3): >>><<< 30583 1726853718.80661: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853718.80665: _low_level_execute_command(): starting 30583 1726853718.80672: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853718.704002-33153-206309799633859/AnsiballZ_package_facts.py && sleep 0' 30583 1726853718.81268: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853718.81273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853718.81275: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853718.81277: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853718.81279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853718.81334: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853718.81336: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853718.81410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853719.26728: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 30583 1726853719.26765: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": 
"2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 30583 1726853719.26793: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 30583 1726853719.26850: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": 
"numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": 
"pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 30583 1726853719.26869: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": 
"0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": 
[{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", 
"version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": 
"libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": 
[{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": 
"openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": 
"perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": 
"511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 30583 1726853719.26897: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", 
"release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 30583 1726853719.26901: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": 
[{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": 
"23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": 
[{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 30583 1726853719.26939: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30583 1726853719.28769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853719.28799: stderr chunk (state=3): >>><<< 30583 1726853719.28803: stdout chunk (state=3): >>><<< 30583 1726853719.28840: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853719.30102: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853718.704002-33153-206309799633859/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853719.30119: _low_level_execute_command(): starting 30583 1726853719.30123: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853718.704002-33153-206309799633859/ > /dev/null 2>&1 && sleep 0' 30583 1726853719.30953: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853719.30972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853719.31089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853719.33005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853719.33052: stderr chunk (state=3): >>><<< 30583 1726853719.33055: stdout chunk (state=3): >>><<< 30583 1726853719.33067: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853719.33077: handler run complete 30583 1726853719.33621: variable 'ansible_facts' from source: unknown 30583 1726853719.34077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 
1726853719.35921: variable 'ansible_facts' from source: unknown 30583 1726853719.36353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853719.36965: attempt loop complete, returning result 30583 1726853719.36986: _execute() done 30583 1726853719.36994: dumping result to json 30583 1726853719.37197: done dumping result, returning 30583 1726853719.37214: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-05ea-abc5-000000001158] 30583 1726853719.37223: sending task result for task 02083763-bbaf-05ea-abc5-000000001158 30583 1726853719.39598: done sending task result for task 02083763-bbaf-05ea-abc5-000000001158 30583 1726853719.39602: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853719.39848: no more pending results, returning what we have 30583 1726853719.39851: results queue empty 30583 1726853719.39852: checking for any_errors_fatal 30583 1726853719.39859: done checking for any_errors_fatal 30583 1726853719.39860: checking for max_fail_percentage 30583 1726853719.39862: done checking for max_fail_percentage 30583 1726853719.39863: checking to see if all hosts have failed and the running result is not ok 30583 1726853719.39863: done checking to see if all hosts have failed 30583 1726853719.39864: getting the remaining hosts for this loop 30583 1726853719.39866: done getting the remaining hosts for this loop 30583 1726853719.39875: getting the next task for host managed_node2 30583 1726853719.39883: done getting next task for host managed_node2 30583 1726853719.39887: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853719.39893: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853719.39906: getting variables 30583 1726853719.39907: in VariableManager get_vars() 30583 1726853719.39947: Calling all_inventory to load vars for managed_node2 30583 1726853719.39949: Calling groups_inventory to load vars for managed_node2 30583 1726853719.39951: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853719.39966: Calling all_plugins_play to load vars for managed_node2 30583 1726853719.39969: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853719.39978: Calling groups_plugins_play to load vars for managed_node2 30583 1726853719.41105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853719.43114: done with get_vars() 30583 1726853719.43145: done getting variables 30583 1726853719.43337: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:35:19 -0400 (0:00:00.776) 0:00:54.772 ****** 30583 1726853719.43516: entering _queue_task() for managed_node2/debug 30583 1726853719.43976: worker is 1 (out of 1 available) 30583 1726853719.43992: exiting _queue_task() for managed_node2/debug 30583 1726853719.44004: done queuing things up, now waiting for results queue to drain 30583 1726853719.44005: waiting for pending results... 
30583 1726853719.44241: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853719.44335: in run() - task 02083763-bbaf-05ea-abc5-0000000010f6 30583 1726853719.44347: variable 'ansible_search_path' from source: unknown 30583 1726853719.44351: variable 'ansible_search_path' from source: unknown 30583 1726853719.44385: calling self._execute() 30583 1726853719.44465: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853719.44470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853719.44480: variable 'omit' from source: magic vars 30583 1726853719.44766: variable 'ansible_distribution_major_version' from source: facts 30583 1726853719.44770: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853719.44780: variable 'omit' from source: magic vars 30583 1726853719.44825: variable 'omit' from source: magic vars 30583 1726853719.44898: variable 'network_provider' from source: set_fact 30583 1726853719.44914: variable 'omit' from source: magic vars 30583 1726853719.44947: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853719.44981: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853719.44997: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853719.45010: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853719.45022: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853719.45047: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853719.45050: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 
1726853719.45052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853719.45130: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853719.45134: Set connection var ansible_timeout to 10 30583 1726853719.45136: Set connection var ansible_connection to ssh 30583 1726853719.45138: Set connection var ansible_shell_executable to /bin/sh 30583 1726853719.45141: Set connection var ansible_shell_type to sh 30583 1726853719.45148: Set connection var ansible_pipelining to False 30583 1726853719.45168: variable 'ansible_shell_executable' from source: unknown 30583 1726853719.45173: variable 'ansible_connection' from source: unknown 30583 1726853719.45176: variable 'ansible_module_compression' from source: unknown 30583 1726853719.45179: variable 'ansible_shell_type' from source: unknown 30583 1726853719.45181: variable 'ansible_shell_executable' from source: unknown 30583 1726853719.45183: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853719.45186: variable 'ansible_pipelining' from source: unknown 30583 1726853719.45189: variable 'ansible_timeout' from source: unknown 30583 1726853719.45191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853719.45295: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853719.45309: variable 'omit' from source: magic vars 30583 1726853719.45312: starting attempt loop 30583 1726853719.45315: running the handler 30583 1726853719.45350: handler run complete 30583 1726853719.45363: attempt loop complete, returning result 30583 1726853719.45366: _execute() done 30583 1726853719.45368: dumping result to json 30583 1726853719.45372: done dumping result, returning 
30583 1726853719.45378: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-05ea-abc5-0000000010f6] 30583 1726853719.45381: sending task result for task 02083763-bbaf-05ea-abc5-0000000010f6 30583 1726853719.45474: done sending task result for task 02083763-bbaf-05ea-abc5-0000000010f6 30583 1726853719.45477: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 30583 1726853719.45581: no more pending results, returning what we have 30583 1726853719.45591: results queue empty 30583 1726853719.45592: checking for any_errors_fatal 30583 1726853719.45603: done checking for any_errors_fatal 30583 1726853719.45604: checking for max_fail_percentage 30583 1726853719.45606: done checking for max_fail_percentage 30583 1726853719.45607: checking to see if all hosts have failed and the running result is not ok 30583 1726853719.45608: done checking to see if all hosts have failed 30583 1726853719.45609: getting the remaining hosts for this loop 30583 1726853719.45610: done getting the remaining hosts for this loop 30583 1726853719.45614: getting the next task for host managed_node2 30583 1726853719.45623: done getting next task for host managed_node2 30583 1726853719.45627: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853719.45632: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853719.45644: getting variables 30583 1726853719.45646: in VariableManager get_vars() 30583 1726853719.45691: Calling all_inventory to load vars for managed_node2 30583 1726853719.45701: Calling groups_inventory to load vars for managed_node2 30583 1726853719.45704: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853719.45714: Calling all_plugins_play to load vars for managed_node2 30583 1726853719.45717: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853719.45720: Calling groups_plugins_play to load vars for managed_node2 30583 1726853719.48867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853719.50754: done with get_vars() 30583 1726853719.50806: done getting variables 30583 1726853719.50885: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:35:19 -0400 (0:00:00.074) 0:00:54.846 ****** 30583 1726853719.50975: entering _queue_task() for managed_node2/fail 30583 1726853719.51486: worker is 1 (out of 1 available) 30583 1726853719.51498: exiting _queue_task() for managed_node2/fail 30583 1726853719.51510: done queuing things up, now waiting for results queue to drain 30583 1726853719.51512: waiting for pending results... 30583 1726853719.51902: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853719.51999: in run() - task 02083763-bbaf-05ea-abc5-0000000010f7 30583 1726853719.52003: variable 'ansible_search_path' from source: unknown 30583 1726853719.52076: variable 'ansible_search_path' from source: unknown 30583 1726853719.52080: calling self._execute() 30583 1726853719.52284: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853719.52426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853719.52429: variable 'omit' from source: magic vars 30583 1726853719.53309: variable 'ansible_distribution_major_version' from source: facts 30583 1726853719.53331: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853719.53577: variable 'network_state' from source: role '' defaults 30583 1726853719.53607: Evaluated conditional (network_state != {}): False 30583 1726853719.53616: when evaluation is False, skipping this task 30583 1726853719.53623: _execute() done 30583 1726853719.53633: dumping result to json 30583 1726853719.53673: done dumping result, returning 30583 1726853719.53690: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-05ea-abc5-0000000010f7]
30583 1726853719.53701: sending task result for task 02083763-bbaf-05ea-abc5-0000000010f7
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30583 1726853719.54012: no more pending results, returning what we have
30583 1726853719.54017: results queue empty
30583 1726853719.54018: checking for any_errors_fatal
30583 1726853719.54026: done checking for any_errors_fatal
30583 1726853719.54027: checking for max_fail_percentage
30583 1726853719.54029: done checking for max_fail_percentage
30583 1726853719.54030: checking to see if all hosts have failed and the running result is not ok
30583 1726853719.54031: done checking to see if all hosts have failed
30583 1726853719.54032: getting the remaining hosts for this loop
30583 1726853719.54034: done getting the remaining hosts for this loop
30583 1726853719.54038: getting the next task for host managed_node2
30583 1726853719.54049: done getting next task for host managed_node2
30583 1726853719.54053: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
30583 1726853719.54059: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853719.54088: getting variables
30583 1726853719.54091: in VariableManager get_vars()
30583 1726853719.54134: Calling all_inventory to load vars for managed_node2
30583 1726853719.54137: Calling groups_inventory to load vars for managed_node2
30583 1726853719.54140: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853719.54152: Calling all_plugins_play to load vars for managed_node2
30583 1726853719.54155: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853719.54158: Calling groups_plugins_play to load vars for managed_node2
30583 1726853719.54776: done sending task result for task 02083763-bbaf-05ea-abc5-0000000010f7
30583 1726853719.54780: WORKER PROCESS EXITING
30583 1726853719.56824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853719.59067: done with get_vars()
30583 1726853719.59092: done getting variables
30583 1726853719.59204: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Friday 20 September 2024 13:35:19 -0400 (0:00:00.082) 0:00:54.929 ******
30583 1726853719.59240: entering _queue_task() for managed_node2/fail
30583 1726853719.59701: worker is 1 (out of 1 available)
30583 1726853719.59714: exiting _queue_task() for managed_node2/fail
30583 1726853719.59726: done queuing things up, now waiting for results queue to drain
30583 1726853719.59727: waiting for pending results...
30583 1726853719.60665: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
30583 1726853719.61048: in run() - task 02083763-bbaf-05ea-abc5-0000000010f8
30583 1726853719.61102: variable 'ansible_search_path' from source: unknown
30583 1726853719.61169: variable 'ansible_search_path' from source: unknown
30583 1726853719.61226: calling self._execute()
30583 1726853719.61446: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853719.61529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853719.61546: variable 'omit' from source: magic vars
30583 1726853719.62008: variable 'ansible_distribution_major_version' from source: facts
30583 1726853719.62025: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853719.62148: variable 'network_state' from source: role '' defaults
30583 1726853719.62169: Evaluated conditional (network_state != {}): False
30583 1726853719.62180: when evaluation is False, skipping this task
30583 1726853719.62187: _execute() done
30583 1726853719.62197: dumping result to json
30583 1726853719.62204: done dumping result, returning
30583 1726853719.62216: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-05ea-abc5-0000000010f8]
30583 1726853719.62225: sending task result for task 02083763-bbaf-05ea-abc5-0000000010f8
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30583 1726853719.62380: no more pending results, returning what we have
30583 1726853719.62384: results queue empty
30583 1726853719.62385: checking for any_errors_fatal
30583 1726853719.62391: done checking for any_errors_fatal
30583 1726853719.62391: checking for max_fail_percentage
30583 1726853719.62393: done checking for max_fail_percentage
30583 1726853719.62394: checking to see if all hosts have failed and the running result is not ok
30583 1726853719.62395: done checking to see if all hosts have failed
30583 1726853719.62396: getting the remaining hosts for this loop
30583 1726853719.62397: done getting the remaining hosts for this loop
30583 1726853719.62401: getting the next task for host managed_node2
30583 1726853719.62410: done getting next task for host managed_node2
30583 1726853719.62413: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
30583 1726853719.62419: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853719.62442: getting variables
30583 1726853719.62444: in VariableManager get_vars()
30583 1726853719.62486: Calling all_inventory to load vars for managed_node2
30583 1726853719.62489: Calling groups_inventory to load vars for managed_node2
30583 1726853719.62491: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853719.62504: Calling all_plugins_play to load vars for managed_node2
30583 1726853719.62506: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853719.62510: Calling groups_plugins_play to load vars for managed_node2
30583 1726853719.63032: done sending task result for task 02083763-bbaf-05ea-abc5-0000000010f8
30583 1726853719.63037: WORKER PROCESS EXITING
30583 1726853719.65061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853719.67474: done with get_vars()
30583 1726853719.67509: done getting variables
30583 1726853719.67578: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024 13:35:19 -0400 (0:00:00.083) 0:00:55.013 ******
30583 1726853719.67616: entering _queue_task() for managed_node2/fail
30583 1726853719.67998: worker is 1 (out of 1 available)
30583 1726853719.68012: exiting _queue_task() for managed_node2/fail
30583 1726853719.68025: done queuing things up, now waiting for results queue to drain
30583 1726853719.68027: waiting for pending results...
30583 1726853719.68260: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
30583 1726853719.68496: in run() - task 02083763-bbaf-05ea-abc5-0000000010f9
30583 1726853719.68500: variable 'ansible_search_path' from source: unknown
30583 1726853719.68504: variable 'ansible_search_path' from source: unknown
30583 1726853719.68507: calling self._execute()
30583 1726853719.68578: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853719.68591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853719.68611: variable 'omit' from source: magic vars
30583 1726853719.69179: variable 'ansible_distribution_major_version' from source: facts
30583 1726853719.69392: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853719.69734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30583 1726853719.72391: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30583 1726853719.72480: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30583 1726853719.72525: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30583 1726853719.72654: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30583 1726853719.72775: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30583 1726853719.72869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853719.73006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853719.73011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853719.73121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853719.73141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853719.73345: variable 'ansible_distribution_major_version' from source: facts
30583 1726853719.73417: Evaluated conditional (ansible_distribution_major_version | int > 9): True
30583 1726853719.73898: variable 'ansible_distribution' from source: facts
30583 1726853719.73902: variable '__network_rh_distros' from source: role '' defaults
30583 1726853719.73904: Evaluated conditional (ansible_distribution in __network_rh_distros): True
30583 1726853719.74324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853719.74444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853719.74521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853719.74778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853719.74781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853719.74951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853719.74987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853719.75076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853719.75198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853719.75387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853719.75444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853719.75578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853719.75891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853719.75895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853719.75899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853719.76685: variable 'network_connections' from source: include params
30583 1726853719.76702: variable 'interface' from source: play vars
30583 1726853719.76800: variable 'interface' from source: play vars
30583 1726853719.76838: variable 'network_state' from source: role '' defaults
30583 1726853719.76979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30583 1726853719.77202: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30583 1726853719.77263: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30583 1726853719.77308: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30583 1726853719.77341: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30583 1726853719.77431: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30583 1726853719.77485: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30583 1726853719.77528: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853719.77574: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30583 1726853719.77638: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False
30583 1726853719.77647: when evaluation is False, skipping this task
30583 1726853719.77676: _execute() done
30583 1726853719.77679: dumping result to json
30583 1726853719.77681: done dumping result, returning
30583 1726853719.77684: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-05ea-abc5-0000000010f9]
30583 1726853719.77693: sending task result for task 02083763-bbaf-05ea-abc5-0000000010f9
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0",
    "skip_reason": "Conditional result was False"
}
30583 1726853719.77892: no more pending results, returning what we have
30583 1726853719.77895: results queue empty
30583 1726853719.77896: checking for any_errors_fatal
30583 1726853719.77907: done checking for any_errors_fatal
30583 1726853719.77907: checking for max_fail_percentage
30583 1726853719.77909: done checking for max_fail_percentage
30583 1726853719.77910: checking to see if all hosts have failed and the running result is not ok
30583 1726853719.77911: done checking to see if all hosts have failed
30583 1726853719.77911: getting the remaining hosts for this loop
30583 1726853719.77914: done getting the remaining hosts for this loop
30583 1726853719.77917: getting the next task for host managed_node2
30583 1726853719.77926: done getting next task for host managed_node2
30583 1726853719.77929: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
30583 1726853719.77934: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853719.77956: getting variables
30583 1726853719.77958: in VariableManager get_vars()
30583 1726853719.78094: Calling all_inventory to load vars for managed_node2
30583 1726853719.78097: Calling groups_inventory to load vars for managed_node2
30583 1726853719.78100: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853719.78111: Calling all_plugins_play to load vars for managed_node2
30583 1726853719.78114: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853719.78118: Calling groups_plugins_play to load vars for managed_node2
30583 1726853719.78636: done sending task result for task 02083763-bbaf-05ea-abc5-0000000010f9
30583 1726853719.78639: WORKER PROCESS EXITING
30583 1726853719.79101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853719.89367: done with get_vars()
30583 1726853719.89407: done getting variables
30583 1726853719.89462: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 13:35:19 -0400 (0:00:00.218) 0:00:55.232 ******
30583 1726853719.89494: entering _queue_task() for managed_node2/dnf
30583 1726853719.89882: worker is 1 (out of 1 available)
30583 1726853719.89896: exiting _queue_task() for managed_node2/dnf
30583 1726853719.89910: done queuing things up, now waiting for results queue to drain
30583 1726853719.89913: waiting for pending results...
30583 1726853719.90269: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
30583 1726853719.90382: in run() - task 02083763-bbaf-05ea-abc5-0000000010fa
30583 1726853719.90404: variable 'ansible_search_path' from source: unknown
30583 1726853719.90411: variable 'ansible_search_path' from source: unknown
30583 1726853719.90451: calling self._execute()
30583 1726853719.90566: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853719.90587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853719.90604: variable 'omit' from source: magic vars
30583 1726853719.91013: variable 'ansible_distribution_major_version' from source: facts
30583 1726853719.91016: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853719.91273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30583 1726853719.93574: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30583 1726853719.93660: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30583 1726853719.93705: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30583 1726853719.93750: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30583 1726853719.93786: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30583 1726853719.93878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853719.93957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853719.93963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853719.93997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853719.94017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853719.94146: variable 'ansible_distribution' from source: facts
30583 1726853719.94160: variable 'ansible_distribution_major_version' from source: facts
30583 1726853719.94276: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
30583 1726853719.94315: variable '__network_wireless_connections_defined' from source: role '' defaults
30583 1726853719.94450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853719.94484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853719.94517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853719.94564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853719.94585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853719.94633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853719.94666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853719.94697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853719.94747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853719.94769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853719.94847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853719.94868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853719.94898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853719.94943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853719.94966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853719.95129: variable 'network_connections' from source: include params
30583 1726853719.95147: variable 'interface' from source: play vars
30583 1726853719.95266: variable 'interface' from source: play vars
30583 1726853719.95306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30583 1726853719.95509: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30583 1726853719.95552: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30583 1726853719.95598: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30583 1726853719.95633: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30583 1726853719.95687: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30583 1726853719.95876: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30583 1726853719.95888: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853719.95891: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30583 1726853719.95894: variable '__network_team_connections_defined' from source: role '' defaults
30583 1726853719.96093: variable 'network_connections' from source: include params
30583 1726853719.96104: variable 'interface' from source: play vars
30583 1726853719.96173: variable 'interface' from source: play vars
30583 1726853719.96207: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
30583 1726853719.96215: when evaluation is False, skipping this task
30583 1726853719.96227: _execute() done
30583 1726853719.96235: dumping result to json
30583 1726853719.96243: done dumping result, returning
30583 1726853719.96254: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-0000000010fa]
30583 1726853719.96267: sending task result for task 02083763-bbaf-05ea-abc5-0000000010fa
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
30583 1726853719.96426: no more pending results, returning what we have
30583 1726853719.96430: results queue empty
30583 1726853719.96431: checking for any_errors_fatal
30583 1726853719.96440: done checking for any_errors_fatal
30583 1726853719.96441: checking for max_fail_percentage
30583 1726853719.96443: done checking for max_fail_percentage
30583 1726853719.96444: checking to see if all hosts have failed and the running result is not ok
30583 1726853719.96445: done checking to see if all hosts have failed
30583 1726853719.96445: getting the remaining hosts for this loop
30583 1726853719.96447: done getting the remaining hosts for this loop
30583 1726853719.96451: getting the next task for host managed_node2
30583 1726853719.96459: done getting next task for host managed_node2
30583 1726853719.96463: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
30583 1726853719.96468: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853719.96497: getting variables
30583 1726853719.96499: in VariableManager get_vars()
30583 1726853719.96536: Calling all_inventory to load vars for managed_node2
30583 1726853719.96539: Calling groups_inventory to load vars for managed_node2
30583 1726853719.96541: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853719.96551: Calling all_plugins_play to load vars for managed_node2
30583 1726853719.96555: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853719.96558: Calling groups_plugins_play to load vars for managed_node2
30583 1726853719.97174: done sending task result for task 02083763-bbaf-05ea-abc5-0000000010fa
30583 1726853719.97178: WORKER PROCESS EXITING
30583 1726853719.98183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853719.99083: done with get_vars()
30583 1726853719.99103: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
30583 1726853719.99163: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024 13:35:19 -0400 (0:00:00.096) 0:00:55.329 ******
30583 1726853719.99191: entering _queue_task() for managed_node2/yum
30583 1726853719.99455: worker is 1 (out of 1 available)
30583 1726853719.99470: exiting _queue_task() for managed_node2/yum
30583 1726853719.99485: done queuing things up, now waiting for results queue to drain
30583 1726853719.99487: waiting for pending results...
30583 1726853719.99683: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
30583 1726853719.99786: in run() - task 02083763-bbaf-05ea-abc5-0000000010fb
30583 1726853719.99797: variable 'ansible_search_path' from source: unknown
30583 1726853719.99801: variable 'ansible_search_path' from source: unknown
30583 1726853719.99875: calling self._execute()
30583 1726853719.99993: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853720.00000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853720.00003: variable 'omit' from source: magic vars
30583 1726853720.00391: variable 'ansible_distribution_major_version' from source: facts
30583 1726853720.00394: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853720.00561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30583 1726853720.02945: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30583 1726853720.03304: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30583 1726853720.03335: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30583 1726853720.03361: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30583 1726853720.03385: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30583 1726853720.03442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853720.03469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853720.03488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853720.03513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853720.03524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853720.03598: variable 'ansible_distribution_major_version' from source: facts
30583 1726853720.03612: Evaluated conditional (ansible_distribution_major_version | int < 8): False
30583 1726853720.03616: when evaluation is False, skipping this task
30583 1726853720.03618: _execute() done
30583 1726853720.03621: dumping result to json
30583 1726853720.03624: done dumping result,
returning 30583 1726853720.03632: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-0000000010fb] 30583 1726853720.03637: sending task result for task 02083763-bbaf-05ea-abc5-0000000010fb 30583 1726853720.03731: done sending task result for task 02083763-bbaf-05ea-abc5-0000000010fb 30583 1726853720.03734: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30583 1726853720.03785: no more pending results, returning what we have 30583 1726853720.03790: results queue empty 30583 1726853720.03791: checking for any_errors_fatal 30583 1726853720.03796: done checking for any_errors_fatal 30583 1726853720.03797: checking for max_fail_percentage 30583 1726853720.03799: done checking for max_fail_percentage 30583 1726853720.03800: checking to see if all hosts have failed and the running result is not ok 30583 1726853720.03800: done checking to see if all hosts have failed 30583 1726853720.03801: getting the remaining hosts for this loop 30583 1726853720.03803: done getting the remaining hosts for this loop 30583 1726853720.03806: getting the next task for host managed_node2 30583 1726853720.03814: done getting next task for host managed_node2 30583 1726853720.03818: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30583 1726853720.03823: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853720.03844: getting variables 30583 1726853720.03846: in VariableManager get_vars() 30583 1726853720.03886: Calling all_inventory to load vars for managed_node2 30583 1726853720.03888: Calling groups_inventory to load vars for managed_node2 30583 1726853720.03891: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853720.03900: Calling all_plugins_play to load vars for managed_node2 30583 1726853720.03903: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853720.03944: Calling groups_plugins_play to load vars for managed_node2 30583 1726853720.05540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853720.07243: done with get_vars() 30583 1726853720.07273: done getting variables 30583 1726853720.07338: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:35:20 -0400 (0:00:00.081) 0:00:55.410 ****** 30583 1726853720.07379: entering _queue_task() for managed_node2/fail 30583 1726853720.08105: worker is 1 (out of 1 available) 30583 1726853720.08114: exiting _queue_task() for managed_node2/fail 30583 1726853720.08125: done queuing things up, now waiting for results queue to drain 30583 1726853720.08126: waiting for pending results... 30583 1726853720.08394: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30583 1726853720.08407: in run() - task 02083763-bbaf-05ea-abc5-0000000010fc 30583 1726853720.08423: variable 'ansible_search_path' from source: unknown 30583 1726853720.08427: variable 'ansible_search_path' from source: unknown 30583 1726853720.08480: calling self._execute() 30583 1726853720.08601: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853720.08605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853720.08608: variable 'omit' from source: magic vars 30583 1726853720.08988: variable 'ansible_distribution_major_version' from source: facts 30583 1726853720.08999: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853720.09122: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853720.09316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853720.11377: Loading 
FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853720.11433: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853720.11463: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853720.11490: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853720.11509: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853720.11572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853720.11593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853720.11611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853720.11636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853720.11646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853720.11685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 
1726853720.11701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853720.11718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853720.11742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853720.11752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853720.11786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853720.11802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853720.11817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853720.11842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853720.11851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30583 1726853720.11970: variable 'network_connections' from source: include params 30583 1726853720.11985: variable 'interface' from source: play vars 30583 1726853720.12033: variable 'interface' from source: play vars 30583 1726853720.12087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853720.12199: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853720.12236: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853720.12261: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853720.12283: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853720.12319: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853720.12334: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853720.12351: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853720.12370: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853720.12421: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853720.12581: variable 'network_connections' from source: include params 30583 1726853720.12585: variable 'interface' from source: play 
vars 30583 1726853720.12629: variable 'interface' from source: play vars 30583 1726853720.12656: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853720.12662: when evaluation is False, skipping this task 30583 1726853720.12665: _execute() done 30583 1726853720.12667: dumping result to json 30583 1726853720.12669: done dumping result, returning 30583 1726853720.12676: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-0000000010fc] 30583 1726853720.12681: sending task result for task 02083763-bbaf-05ea-abc5-0000000010fc 30583 1726853720.12778: done sending task result for task 02083763-bbaf-05ea-abc5-0000000010fc 30583 1726853720.12780: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853720.12833: no more pending results, returning what we have 30583 1726853720.12837: results queue empty 30583 1726853720.12838: checking for any_errors_fatal 30583 1726853720.12844: done checking for any_errors_fatal 30583 1726853720.12844: checking for max_fail_percentage 30583 1726853720.12846: done checking for max_fail_percentage 30583 1726853720.12847: checking to see if all hosts have failed and the running result is not ok 30583 1726853720.12848: done checking to see if all hosts have failed 30583 1726853720.12848: getting the remaining hosts for this loop 30583 1726853720.12850: done getting the remaining hosts for this loop 30583 1726853720.12854: getting the next task for host managed_node2 30583 1726853720.12864: done getting next task for host managed_node2 30583 1726853720.12868: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30583 1726853720.12875: ^ state is: 
HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853720.12897: getting variables 30583 1726853720.12898: in VariableManager get_vars() 30583 1726853720.12937: Calling all_inventory to load vars for managed_node2 30583 1726853720.12940: Calling groups_inventory to load vars for managed_node2 30583 1726853720.12942: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853720.12951: Calling all_plugins_play to load vars for managed_node2 30583 1726853720.12953: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853720.12956: Calling groups_plugins_play to load vars for managed_node2 30583 1726853720.13796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853720.14685: done with get_vars() 30583 1726853720.14705: done getting variables 30583 1726853720.14750: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:35:20 -0400 (0:00:00.074) 0:00:55.485 ****** 30583 1726853720.14782: entering _queue_task() for managed_node2/package 30583 1726853720.15053: worker is 1 (out of 1 available) 30583 1726853720.15072: exiting _queue_task() for managed_node2/package 30583 1726853720.15085: done queuing things up, now waiting for results queue to drain 30583 1726853720.15086: waiting for pending results... 
30583 1726853720.15275: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 30583 1726853720.15376: in run() - task 02083763-bbaf-05ea-abc5-0000000010fd 30583 1726853720.15387: variable 'ansible_search_path' from source: unknown 30583 1726853720.15391: variable 'ansible_search_path' from source: unknown 30583 1726853720.15424: calling self._execute() 30583 1726853720.15495: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853720.15499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853720.15508: variable 'omit' from source: magic vars 30583 1726853720.15785: variable 'ansible_distribution_major_version' from source: facts 30583 1726853720.15794: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853720.15928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853720.16125: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853720.16156: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853720.16190: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853720.16238: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853720.16318: variable 'network_packages' from source: role '' defaults 30583 1726853720.16390: variable '__network_provider_setup' from source: role '' defaults 30583 1726853720.16404: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853720.16445: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853720.16453: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853720.16498: variable 
'__network_packages_default_nm' from source: role '' defaults 30583 1726853720.16611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853720.18148: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853720.18192: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853720.18217: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853720.18241: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853720.18265: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853720.18322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853720.18342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853720.18366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853720.18391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853720.18402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 
1726853720.18433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853720.18448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853720.18465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853720.18496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853720.18506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853720.18659: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853720.18734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853720.18750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853720.18768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853720.18793: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853720.18808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853720.18867: variable 'ansible_python' from source: facts 30583 1726853720.18881: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853720.18938: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853720.18993: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853720.19076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853720.19093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853720.19111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853720.19139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853720.19149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853720.19186: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853720.19205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853720.19221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853720.19250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853720.19260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853720.19355: variable 'network_connections' from source: include params 30583 1726853720.19358: variable 'interface' from source: play vars 30583 1726853720.19432: variable 'interface' from source: play vars 30583 1726853720.19488: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853720.19507: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853720.19527: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853720.19548: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853720.19589: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853720.19760: variable 'network_connections' from source: include params 30583 1726853720.19766: variable 'interface' from source: play vars 30583 1726853720.19838: variable 'interface' from source: play vars 30583 1726853720.19880: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853720.19934: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853720.20131: variable 'network_connections' from source: include params 30583 1726853720.20134: variable 'interface' from source: play vars 30583 1726853720.20181: variable 'interface' from source: play vars 30583 1726853720.20198: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853720.20254: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853720.20449: variable 'network_connections' from source: include params 30583 1726853720.20453: variable 'interface' from source: play vars 30583 1726853720.20501: variable 'interface' from source: play vars 30583 1726853720.20603: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853720.20613: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853720.20656: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853720.20685: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853720.20819: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853720.21184: variable 'network_connections' from source: include params 30583 1726853720.21193: variable 'interface' from 
source: play vars 30583 1726853720.21261: variable 'interface' from source: play vars 30583 1726853720.21268: variable 'ansible_distribution' from source: facts 30583 1726853720.21272: variable '__network_rh_distros' from source: role '' defaults 30583 1726853720.21275: variable 'ansible_distribution_major_version' from source: facts 30583 1726853720.21293: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853720.21406: variable 'ansible_distribution' from source: facts 30583 1726853720.21410: variable '__network_rh_distros' from source: role '' defaults 30583 1726853720.21412: variable 'ansible_distribution_major_version' from source: facts 30583 1726853720.21422: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853720.21681: variable 'ansible_distribution' from source: facts 30583 1726853720.21687: variable '__network_rh_distros' from source: role '' defaults 30583 1726853720.21693: variable 'ansible_distribution_major_version' from source: facts 30583 1726853720.21696: variable 'network_provider' from source: set_fact 30583 1726853720.21702: variable 'ansible_facts' from source: unknown 30583 1726853720.22531: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30583 1726853720.22538: when evaluation is False, skipping this task 30583 1726853720.22541: _execute() done 30583 1726853720.22543: dumping result to json 30583 1726853720.22545: done dumping result, returning 30583 1726853720.22604: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-05ea-abc5-0000000010fd] 30583 1726853720.22607: sending task result for task 02083763-bbaf-05ea-abc5-0000000010fd 30583 1726853720.22689: done sending task result for task 02083763-bbaf-05ea-abc5-0000000010fd 30583 1726853720.22691: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, 
"false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30583 1726853720.22748: no more pending results, returning what we have 30583 1726853720.22751: results queue empty 30583 1726853720.22752: checking for any_errors_fatal 30583 1726853720.22760: done checking for any_errors_fatal 30583 1726853720.22761: checking for max_fail_percentage 30583 1726853720.22763: done checking for max_fail_percentage 30583 1726853720.22764: checking to see if all hosts have failed and the running result is not ok 30583 1726853720.22765: done checking to see if all hosts have failed 30583 1726853720.22765: getting the remaining hosts for this loop 30583 1726853720.22767: done getting the remaining hosts for this loop 30583 1726853720.22773: getting the next task for host managed_node2 30583 1726853720.22783: done getting next task for host managed_node2 30583 1726853720.22787: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30583 1726853720.23015: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853720.23036: getting variables 30583 1726853720.23037: in VariableManager get_vars() 30583 1726853720.23078: Calling all_inventory to load vars for managed_node2 30583 1726853720.23199: Calling groups_inventory to load vars for managed_node2 30583 1726853720.23203: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853720.23212: Calling all_plugins_play to load vars for managed_node2 30583 1726853720.23215: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853720.23217: Calling groups_plugins_play to load vars for managed_node2 30583 1726853720.24993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853720.26341: done with get_vars() 30583 1726853720.26365: done getting variables 30583 1726853720.26427: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:35:20 -0400 (0:00:00.116) 0:00:55.601 ****** 30583 1726853720.26465: entering _queue_task() for managed_node2/package 30583 1726853720.26827: worker is 1 (out of 1 available) 30583 1726853720.26838: exiting _queue_task() for managed_node2/package 30583 1726853720.26850: done queuing things up, now waiting for results queue to drain 30583 
1726853720.26851: waiting for pending results... 30583 1726853720.27590: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30583 1726853720.27596: in run() - task 02083763-bbaf-05ea-abc5-0000000010fe 30583 1726853720.27598: variable 'ansible_search_path' from source: unknown 30583 1726853720.27601: variable 'ansible_search_path' from source: unknown 30583 1726853720.27603: calling self._execute() 30583 1726853720.27778: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853720.27782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853720.27787: variable 'omit' from source: magic vars 30583 1726853720.28430: variable 'ansible_distribution_major_version' from source: facts 30583 1726853720.28441: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853720.28735: variable 'network_state' from source: role '' defaults 30583 1726853720.28978: Evaluated conditional (network_state != {}): False 30583 1726853720.28982: when evaluation is False, skipping this task 30583 1726853720.28985: _execute() done 30583 1726853720.28987: dumping result to json 30583 1726853720.28989: done dumping result, returning 30583 1726853720.28992: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-05ea-abc5-0000000010fe] 30583 1726853720.28995: sending task result for task 02083763-bbaf-05ea-abc5-0000000010fe 30583 1726853720.29075: done sending task result for task 02083763-bbaf-05ea-abc5-0000000010fe 30583 1726853720.29080: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853720.29127: no more pending results, returning what we have 30583 1726853720.29131: 
results queue empty 30583 1726853720.29132: checking for any_errors_fatal 30583 1726853720.29139: done checking for any_errors_fatal 30583 1726853720.29139: checking for max_fail_percentage 30583 1726853720.29141: done checking for max_fail_percentage 30583 1726853720.29142: checking to see if all hosts have failed and the running result is not ok 30583 1726853720.29143: done checking to see if all hosts have failed 30583 1726853720.29143: getting the remaining hosts for this loop 30583 1726853720.29145: done getting the remaining hosts for this loop 30583 1726853720.29149: getting the next task for host managed_node2 30583 1726853720.29160: done getting next task for host managed_node2 30583 1726853720.29164: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30583 1726853720.29170: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853720.29196: getting variables 30583 1726853720.29198: in VariableManager get_vars() 30583 1726853720.29239: Calling all_inventory to load vars for managed_node2 30583 1726853720.29242: Calling groups_inventory to load vars for managed_node2 30583 1726853720.29245: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853720.29256: Calling all_plugins_play to load vars for managed_node2 30583 1726853720.29261: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853720.29264: Calling groups_plugins_play to load vars for managed_node2 30583 1726853720.30938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853720.34631: done with get_vars() 30583 1726853720.34655: done getting variables 30583 1726853720.34717: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:35:20 -0400 (0:00:00.082) 0:00:55.684 ****** 30583 1726853720.34754: entering _queue_task() for managed_node2/package 30583 1726853720.35801: worker is 1 (out of 1 available) 30583 1726853720.35812: exiting _queue_task() for managed_node2/package 30583 1726853720.35823: done queuing things up, now waiting for results queue to drain 30583 1726853720.35824: waiting for pending results... 
30583 1726853720.36289: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30583 1726853720.36348: in run() - task 02083763-bbaf-05ea-abc5-0000000010ff 30583 1726853720.36425: variable 'ansible_search_path' from source: unknown 30583 1726853720.36624: variable 'ansible_search_path' from source: unknown 30583 1726853720.36627: calling self._execute() 30583 1726853720.36709: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853720.36848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853720.36863: variable 'omit' from source: magic vars 30583 1726853720.37465: variable 'ansible_distribution_major_version' from source: facts 30583 1726853720.37612: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853720.37818: variable 'network_state' from source: role '' defaults 30583 1726853720.37837: Evaluated conditional (network_state != {}): False 30583 1726853720.37932: when evaluation is False, skipping this task 30583 1726853720.37939: _execute() done 30583 1726853720.37945: dumping result to json 30583 1726853720.37951: done dumping result, returning 30583 1726853720.37961: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-05ea-abc5-0000000010ff] 30583 1726853720.37969: sending task result for task 02083763-bbaf-05ea-abc5-0000000010ff 30583 1726853720.38376: done sending task result for task 02083763-bbaf-05ea-abc5-0000000010ff 30583 1726853720.38380: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853720.38429: no more pending results, returning what we have 30583 1726853720.38433: results queue empty 30583 1726853720.38435: checking for 
any_errors_fatal 30583 1726853720.38443: done checking for any_errors_fatal 30583 1726853720.38444: checking for max_fail_percentage 30583 1726853720.38447: done checking for max_fail_percentage 30583 1726853720.38448: checking to see if all hosts have failed and the running result is not ok 30583 1726853720.38449: done checking to see if all hosts have failed 30583 1726853720.38449: getting the remaining hosts for this loop 30583 1726853720.38451: done getting the remaining hosts for this loop 30583 1726853720.38456: getting the next task for host managed_node2 30583 1726853720.38465: done getting next task for host managed_node2 30583 1726853720.38470: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30583 1726853720.38478: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853720.38503: getting variables 30583 1726853720.38505: in VariableManager get_vars() 30583 1726853720.38546: Calling all_inventory to load vars for managed_node2 30583 1726853720.38550: Calling groups_inventory to load vars for managed_node2 30583 1726853720.38552: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853720.38564: Calling all_plugins_play to load vars for managed_node2 30583 1726853720.38568: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853720.38935: Calling groups_plugins_play to load vars for managed_node2 30583 1726853720.41440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853720.44677: done with get_vars() 30583 1726853720.44713: done getting variables 30583 1726853720.44980: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:35:20 -0400 (0:00:00.102) 0:00:55.787 ****** 30583 1726853720.45022: entering _queue_task() for managed_node2/service 30583 1726853720.46007: worker is 1 (out of 1 available) 30583 1726853720.46018: exiting _queue_task() for managed_node2/service 30583 1726853720.46028: done queuing things up, now waiting for results queue to drain 30583 1726853720.46030: waiting for pending results... 
30583 1726853720.46258: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30583 1726853720.46679: in run() - task 02083763-bbaf-05ea-abc5-000000001100 30583 1726853720.46683: variable 'ansible_search_path' from source: unknown 30583 1726853720.46686: variable 'ansible_search_path' from source: unknown 30583 1726853720.46881: calling self._execute() 30583 1726853720.46942: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853720.47030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853720.47046: variable 'omit' from source: magic vars 30583 1726853720.47977: variable 'ansible_distribution_major_version' from source: facts 30583 1726853720.47980: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853720.48116: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853720.48478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853720.53201: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853720.53410: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853720.53499: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853720.53613: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853720.53645: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853720.53939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30583 1726853720.53978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853720.54009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853720.54049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853720.54066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853720.54116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853720.54300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853720.54379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853720.54393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853720.54413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853720.54521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853720.54549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853720.54626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853720.54745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853720.54764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853720.55358: variable 'network_connections' from source: include params 30583 1726853720.55361: variable 'interface' from source: play vars 30583 1726853720.55363: variable 'interface' from source: play vars 30583 1726853720.55537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853720.55838: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853720.56821: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853720.56854: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853720.57078: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853720.57082: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853720.57376: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853720.57379: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853720.57382: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853720.57384: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853720.57832: variable 'network_connections' from source: include params 30583 1726853720.57845: variable 'interface' from source: play vars 30583 1726853720.57922: variable 'interface' from source: play vars 30583 1726853720.57965: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853720.57976: when evaluation is False, skipping this task 30583 1726853720.57984: _execute() done 30583 1726853720.57990: dumping result to json 30583 1726853720.57997: done dumping result, returning 30583 1726853720.58009: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000001100] 30583 1726853720.58018: sending task result for task 02083763-bbaf-05ea-abc5-000000001100 skipping: [managed_node2] => { "changed": false, "false_condition": 
"__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853720.58206: no more pending results, returning what we have 30583 1726853720.58210: results queue empty 30583 1726853720.58212: checking for any_errors_fatal 30583 1726853720.58218: done checking for any_errors_fatal 30583 1726853720.58219: checking for max_fail_percentage 30583 1726853720.58222: done checking for max_fail_percentage 30583 1726853720.58223: checking to see if all hosts have failed and the running result is not ok 30583 1726853720.58224: done checking to see if all hosts have failed 30583 1726853720.58224: getting the remaining hosts for this loop 30583 1726853720.58227: done getting the remaining hosts for this loop 30583 1726853720.58231: getting the next task for host managed_node2 30583 1726853720.58242: done getting next task for host managed_node2 30583 1726853720.58247: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30583 1726853720.58252: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853720.58277: getting variables 30583 1726853720.58279: in VariableManager get_vars() 30583 1726853720.58323: Calling all_inventory to load vars for managed_node2 30583 1726853720.58326: Calling groups_inventory to load vars for managed_node2 30583 1726853720.58329: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853720.58340: Calling all_plugins_play to load vars for managed_node2 30583 1726853720.58344: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853720.58347: Calling groups_plugins_play to load vars for managed_node2 30583 1726853720.59349: done sending task result for task 02083763-bbaf-05ea-abc5-000000001100 30583 1726853720.59353: WORKER PROCESS EXITING 30583 1726853720.60960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853720.62739: done with get_vars() 30583 1726853720.62976: done getting variables 30583 1726853720.63054: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:35:20 -0400 (0:00:00.181) 0:00:55.968 ****** 30583 1726853720.63125: entering _queue_task() for managed_node2/service 30583 1726853720.63635: worker is 1 (out of 1 available) 30583 1726853720.63648: exiting _queue_task() for managed_node2/service 30583 1726853720.63661: done queuing 
things up, now waiting for results queue to drain 30583 1726853720.63662: waiting for pending results... 30583 1726853720.63962: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30583 1726853720.64110: in run() - task 02083763-bbaf-05ea-abc5-000000001101 30583 1726853720.64124: variable 'ansible_search_path' from source: unknown 30583 1726853720.64128: variable 'ansible_search_path' from source: unknown 30583 1726853720.64168: calling self._execute() 30583 1726853720.64279: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853720.64285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853720.64295: variable 'omit' from source: magic vars 30583 1726853720.64693: variable 'ansible_distribution_major_version' from source: facts 30583 1726853720.64704: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853720.64880: variable 'network_provider' from source: set_fact 30583 1726853720.64884: variable 'network_state' from source: role '' defaults 30583 1726853720.64895: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30583 1726853720.64901: variable 'omit' from source: magic vars 30583 1726853720.64967: variable 'omit' from source: magic vars 30583 1726853720.64996: variable 'network_service_name' from source: role '' defaults 30583 1726853720.65060: variable 'network_service_name' from source: role '' defaults 30583 1726853720.65179: variable '__network_provider_setup' from source: role '' defaults 30583 1726853720.65185: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853720.65245: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853720.65285: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853720.65323: variable '__network_packages_default_nm' from source: role '' defaults 
30583 1726853720.65657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853720.68497: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853720.68605: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853720.68609: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853720.68642: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853720.68669: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853720.68748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853720.68820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853720.68823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853720.68844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853720.68857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853720.68904: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853720.68928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853720.68952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853720.68992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853720.69036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853720.69243: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853720.69361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853720.69390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853720.69411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853720.69685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853720.69688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853720.69888: variable 'ansible_python' from source: facts 30583 1726853720.69904: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853720.70119: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853720.70200: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853720.70475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853720.70600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853720.70662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853720.70665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853720.70684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853720.70734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853720.70775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853720.70805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853720.70929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853720.70933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853720.71376: variable 'network_connections' from source: include params 30583 1726853720.71380: variable 'interface' from source: play vars 30583 1726853720.71382: variable 'interface' from source: play vars 30583 1726853720.71496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853720.71776: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853720.71780: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853720.71800: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853720.71844: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853720.71906: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853720.71933: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853720.71973: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853720.72006: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853720.72051: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853720.72423: variable 'network_connections' from source: include params 30583 1726853720.72502: variable 'interface' from source: play vars 30583 1726853720.72505: variable 'interface' from source: play vars 30583 1726853720.72622: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853720.72704: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853720.73012: variable 'network_connections' from source: include params 30583 1726853720.73015: variable 'interface' from source: play vars 30583 1726853720.73094: variable 'interface' from source: play vars 30583 1726853720.73119: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853720.73202: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853720.73512: variable 'network_connections' from source: include params 30583 1726853720.73515: variable 'interface' from source: play vars 30583 1726853720.73595: variable 'interface' from source: play vars 30583 1726853720.73654: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30583 1726853720.73721: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853720.73729: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853720.73791: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853720.74075: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853720.74694: variable 'network_connections' from source: include params 30583 1726853720.74698: variable 'interface' from source: play vars 30583 1726853720.74755: variable 'interface' from source: play vars 30583 1726853720.74765: variable 'ansible_distribution' from source: facts 30583 1726853720.74768: variable '__network_rh_distros' from source: role '' defaults 30583 1726853720.74783: variable 'ansible_distribution_major_version' from source: facts 30583 1726853720.74959: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853720.74995: variable 'ansible_distribution' from source: facts 30583 1726853720.75004: variable '__network_rh_distros' from source: role '' defaults 30583 1726853720.75009: variable 'ansible_distribution_major_version' from source: facts 30583 1726853720.75019: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853720.75287: variable 'ansible_distribution' from source: facts 30583 1726853720.75290: variable '__network_rh_distros' from source: role '' defaults 30583 1726853720.75293: variable 'ansible_distribution_major_version' from source: facts 30583 1726853720.75295: variable 'network_provider' from source: set_fact 30583 1726853720.75297: variable 'omit' from source: magic vars 30583 1726853720.75299: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853720.75316: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853720.75339: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853720.75355: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853720.75368: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853720.75457: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853720.75463: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853720.75466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853720.75569: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853720.75578: Set connection var ansible_timeout to 10 30583 1726853720.75581: Set connection var ansible_connection to ssh 30583 1726853720.75587: Set connection var ansible_shell_executable to /bin/sh 30583 1726853720.75589: Set connection var ansible_shell_type to sh 30583 1726853720.75599: Set connection var ansible_pipelining to False 30583 1726853720.75624: variable 'ansible_shell_executable' from source: unknown 30583 1726853720.75627: variable 'ansible_connection' from source: unknown 30583 1726853720.75630: variable 'ansible_module_compression' from source: unknown 30583 1726853720.75632: variable 'ansible_shell_type' from source: unknown 30583 1726853720.75634: variable 'ansible_shell_executable' from source: unknown 30583 1726853720.75636: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853720.75641: variable 'ansible_pipelining' from source: unknown 30583 1726853720.75643: variable 'ansible_timeout' from source: unknown 30583 1726853720.75647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 
1726853720.75826: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853720.75914: variable 'omit' from source: magic vars 30583 1726853720.75917: starting attempt loop 30583 1726853720.75920: running the handler 30583 1726853720.75951: variable 'ansible_facts' from source: unknown 30583 1726853720.76500: _low_level_execute_command(): starting 30583 1726853720.76504: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853720.77009: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853720.77012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853720.77016: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853720.77018: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853720.77056: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 
30583 1726853720.77074: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853720.77158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853720.78893: stdout chunk (state=3): >>>/root <<< 30583 1726853720.78992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853720.79020: stderr chunk (state=3): >>><<< 30583 1726853720.79023: stdout chunk (state=3): >>><<< 30583 1726853720.79040: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853720.79050: _low_level_execute_command(): starting 30583 1726853720.79056: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726853720.7904034-33280-223551704001410 `" && echo ansible-tmp-1726853720.7904034-33280-223551704001410="` echo /root/.ansible/tmp/ansible-tmp-1726853720.7904034-33280-223551704001410 `" ) && sleep 0' 30583 1726853720.79454: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853720.79488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853720.79491: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853720.79494: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853720.79496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853720.79537: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853720.79544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853720.79626: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853720.81602: stdout chunk (state=3): >>>ansible-tmp-1726853720.7904034-33280-223551704001410=/root/.ansible/tmp/ansible-tmp-1726853720.7904034-33280-223551704001410 <<< 30583 1726853720.81710: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853720.81731: stderr chunk (state=3): >>><<< 30583 1726853720.81735: stdout chunk (state=3): >>><<< 30583 1726853720.81748: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853720.7904034-33280-223551704001410=/root/.ansible/tmp/ansible-tmp-1726853720.7904034-33280-223551704001410 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853720.81774: variable 'ansible_module_compression' from source: unknown 30583 1726853720.81814: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30583 1726853720.81857: variable 'ansible_facts' from source: unknown 30583 1726853720.81988: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726853720.7904034-33280-223551704001410/AnsiballZ_systemd.py 30583 1726853720.82087: Sending initial data 30583 1726853720.82090: Sent initial data (156 bytes) 30583 1726853720.82514: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853720.82517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853720.82523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853720.82525: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853720.82527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853720.82575: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853720.82579: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853720.82655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853720.84332: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports 
extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30583 1726853720.84336: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853720.84400: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853720.84469: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpajo6ij_b /root/.ansible/tmp/ansible-tmp-1726853720.7904034-33280-223551704001410/AnsiballZ_systemd.py <<< 30583 1726853720.84477: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853720.7904034-33280-223551704001410/AnsiballZ_systemd.py" <<< 30583 1726853720.84537: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpajo6ij_b" to remote "/root/.ansible/tmp/ansible-tmp-1726853720.7904034-33280-223551704001410/AnsiballZ_systemd.py" <<< 30583 1726853720.84540: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853720.7904034-33280-223551704001410/AnsiballZ_systemd.py" <<< 30583 1726853720.86076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853720.86079: stderr chunk (state=3): >>><<< 30583 1726853720.86080: stdout chunk (state=3): >>><<< 30583 1726853720.86100: done transferring module to remote 30583 1726853720.86110: _low_level_execute_command(): starting 30583 1726853720.86118: _low_level_execute_command(): executing: 
/bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853720.7904034-33280-223551704001410/ /root/.ansible/tmp/ansible-tmp-1726853720.7904034-33280-223551704001410/AnsiballZ_systemd.py && sleep 0' 30583 1726853720.86664: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853720.86680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853720.86691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853720.86742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853720.86745: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853720.86821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853720.88737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853720.88762: stderr chunk (state=3): >>><<< 30583 1726853720.88766: stdout chunk (state=3): >>><<< 30583 1726853720.88782: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853720.88785: _low_level_execute_command(): starting 30583 1726853720.88789: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853720.7904034-33280-223551704001410/AnsiballZ_systemd.py && sleep 0' 30583 1726853720.89218: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853720.89221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853720.89224: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 
1726853720.89226: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853720.89228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853720.89275: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853720.89279: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853720.89360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853721.19227: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", 
"ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4599808", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3310243840", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1854640000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": 
"[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", 
"OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", 
"InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30583 1726853721.21133: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853721.21137: stderr chunk (state=3): >>><<< 30583 1726853721.21140: stdout chunk (state=3): >>><<< 30583 1726853721.21478: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4599808", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3310243840", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1854640000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", 
"ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853721.21491: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853720.7904034-33280-223551704001410/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853721.21494: _low_level_execute_command(): starting 30583 1726853721.21680: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853720.7904034-33280-223551704001410/ > /dev/null 2>&1 && sleep 0' 30583 1726853721.22716: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853721.22889: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853721.22987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853721.23224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853721.23299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853721.25252: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853721.25266: stdout chunk (state=3): >>><<< 30583 1726853721.25281: stderr chunk (state=3): >>><<< 30583 1726853721.25388: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853721.25402: handler run complete 30583 1726853721.25474: 
attempt loop complete, returning result 30583 1726853721.25776: _execute() done 30583 1726853721.25779: dumping result to json 30583 1726853721.25782: done dumping result, returning 30583 1726853721.25785: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-05ea-abc5-000000001101] 30583 1726853721.25788: sending task result for task 02083763-bbaf-05ea-abc5-000000001101 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853721.26118: no more pending results, returning what we have 30583 1726853721.26121: results queue empty 30583 1726853721.26122: checking for any_errors_fatal 30583 1726853721.26127: done checking for any_errors_fatal 30583 1726853721.26128: checking for max_fail_percentage 30583 1726853721.26130: done checking for max_fail_percentage 30583 1726853721.26131: checking to see if all hosts have failed and the running result is not ok 30583 1726853721.26131: done checking to see if all hosts have failed 30583 1726853721.26132: getting the remaining hosts for this loop 30583 1726853721.26134: done getting the remaining hosts for this loop 30583 1726853721.26138: getting the next task for host managed_node2 30583 1726853721.26149: done getting next task for host managed_node2 30583 1726853721.26154: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30583 1726853721.26159: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853721.26176: getting variables 30583 1726853721.26178: in VariableManager get_vars() 30583 1726853721.26330: Calling all_inventory to load vars for managed_node2 30583 1726853721.26334: Calling groups_inventory to load vars for managed_node2 30583 1726853721.26336: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853721.26346: Calling all_plugins_play to load vars for managed_node2 30583 1726853721.26348: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853721.26351: Calling groups_plugins_play to load vars for managed_node2 30583 1726853721.27679: done sending task result for task 02083763-bbaf-05ea-abc5-000000001101 30583 1726853721.27683: WORKER PROCESS EXITING 30583 1726853721.28663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853721.30443: done with get_vars() 30583 1726853721.30475: done getting variables 30583 1726853721.30543: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:35:21 -0400 (0:00:00.674) 0:00:56.643 ****** 30583 1726853721.30590: entering _queue_task() for managed_node2/service 30583 1726853721.31095: worker is 1 (out of 1 available) 30583 1726853721.31106: exiting _queue_task() for managed_node2/service 30583 1726853721.31118: done queuing things up, now waiting for results queue to drain 30583 1726853721.31119: waiting for pending results... 30583 1726853721.31345: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30583 1726853721.31502: in run() - task 02083763-bbaf-05ea-abc5-000000001102 30583 1726853721.31543: variable 'ansible_search_path' from source: unknown 30583 1726853721.31547: variable 'ansible_search_path' from source: unknown 30583 1726853721.31592: calling self._execute() 30583 1726853721.31761: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853721.31769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853721.31783: variable 'omit' from source: magic vars 30583 1726853721.32329: variable 'ansible_distribution_major_version' from source: facts 30583 1726853721.32340: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853721.32581: variable 'network_provider' from source: set_fact 30583 1726853721.32585: Evaluated conditional (network_provider == "nm"): True 30583 1726853721.32588: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853721.32665: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 30583 1726853721.32855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853721.35878: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853721.35910: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853721.35945: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853721.35987: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853721.36018: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853721.36104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853721.36332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853721.36776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853721.36779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853721.36782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853721.36784: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853721.36787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853721.36799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853721.36839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853721.36854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853721.37128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853721.37144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853721.37175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853721.37346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 
1726853721.37350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853721.37616: variable 'network_connections' from source: include params 30583 1726853721.37628: variable 'interface' from source: play vars 30583 1726853721.37817: variable 'interface' from source: play vars 30583 1726853721.38045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853721.38475: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853721.38630: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853721.38678: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853721.38706: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853721.38881: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853721.38924: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853721.38935: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853721.39115: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853721.39295: variable 
'__network_wireless_connections_defined' from source: role '' defaults 30583 1726853721.40037: variable 'network_connections' from source: include params 30583 1726853721.40041: variable 'interface' from source: play vars 30583 1726853721.40208: variable 'interface' from source: play vars 30583 1726853721.40365: Evaluated conditional (__network_wpa_supplicant_required): False 30583 1726853721.40368: when evaluation is False, skipping this task 30583 1726853721.40372: _execute() done 30583 1726853721.40375: dumping result to json 30583 1726853721.40476: done dumping result, returning 30583 1726853721.40480: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-05ea-abc5-000000001102] 30583 1726853721.40505: sending task result for task 02083763-bbaf-05ea-abc5-000000001102 30583 1726853721.40795: done sending task result for task 02083763-bbaf-05ea-abc5-000000001102 30583 1726853721.40801: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30583 1726853721.40856: no more pending results, returning what we have 30583 1726853721.40860: results queue empty 30583 1726853721.40861: checking for any_errors_fatal 30583 1726853721.40888: done checking for any_errors_fatal 30583 1726853721.40890: checking for max_fail_percentage 30583 1726853721.40892: done checking for max_fail_percentage 30583 1726853721.40893: checking to see if all hosts have failed and the running result is not ok 30583 1726853721.40894: done checking to see if all hosts have failed 30583 1726853721.40894: getting the remaining hosts for this loop 30583 1726853721.40896: done getting the remaining hosts for this loop 30583 1726853721.40900: getting the next task for host managed_node2 30583 1726853721.40909: done getting next task for host managed_node2 30583 1726853721.40913: ^ task is: TASK: 
fedora.linux_system_roles.network : Enable network service 30583 1726853721.40919: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853721.40939: getting variables 30583 1726853721.40941: in VariableManager get_vars() 30583 1726853721.41183: Calling all_inventory to load vars for managed_node2 30583 1726853721.41186: Calling groups_inventory to load vars for managed_node2 30583 1726853721.41189: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853721.41198: Calling all_plugins_play to load vars for managed_node2 30583 1726853721.41201: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853721.41204: Calling groups_plugins_play to load vars for managed_node2 30583 1726853721.44426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853721.48210: done with get_vars() 30583 1726853721.48381: done getting variables 30583 1726853721.48531: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:35:21 -0400 (0:00:00.180) 0:00:56.823 ****** 30583 1726853721.48602: entering _queue_task() for managed_node2/service 30583 1726853721.49508: worker is 1 (out of 1 available) 30583 1726853721.49526: exiting _queue_task() for managed_node2/service 30583 1726853721.49541: done queuing things up, now waiting for results queue to drain 30583 1726853721.49543: waiting for pending results... 
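The trace above shows the pattern Ansible follows for a `when:`-gated task: the conditional is evaluated against the host's variables, and a false result short-circuits execution, returning a "skipping" result instead of invoking the module (as with `__network_wpa_supplicant_required` here). A minimal sketch of that control flow, with hypothetical helper names and `eval` standing in for Ansible's actual Jinja2 conditional evaluation:

```python
def run_task(conditional: str, variables: dict) -> dict:
    """Mimic the TaskExecutor behaviour visible in the log: a False
    conditional skips the task and records why (illustrative only;
    Ansible evaluates conditionals with Jinja2, not eval)."""
    result = bool(eval(conditional, {}, dict(variables)))
    if not result:
        # Mirrors the skipped-task JSON seen in the trace
        return {
            "changed": False,
            "false_condition": conditional,
            "skip_reason": "Conditional result was False",
        }
    return {"changed": True}

# The wpa_supplicant task skips because the provider does not require it:
print(run_task("__network_wpa_supplicant_required",
               {"__network_wpa_supplicant_required": False}))
```

The actual result dictionary in the log carries the same three keys, which is how `ansible-playbook` later prints the `skipping: [managed_node2]` summary.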
30583 1726853721.49896: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30583 1726853721.50023: in run() - task 02083763-bbaf-05ea-abc5-000000001103 30583 1726853721.50042: variable 'ansible_search_path' from source: unknown 30583 1726853721.50052: variable 'ansible_search_path' from source: unknown 30583 1726853721.50100: calling self._execute() 30583 1726853721.50208: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853721.50219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853721.50234: variable 'omit' from source: magic vars 30583 1726853721.50601: variable 'ansible_distribution_major_version' from source: facts 30583 1726853721.50617: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853721.50735: variable 'network_provider' from source: set_fact 30583 1726853721.50746: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853721.50753: when evaluation is False, skipping this task 30583 1726853721.50760: _execute() done 30583 1726853721.50767: dumping result to json 30583 1726853721.50777: done dumping result, returning 30583 1726853721.50787: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-05ea-abc5-000000001103] 30583 1726853721.50796: sending task result for task 02083763-bbaf-05ea-abc5-000000001103 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853721.50960: no more pending results, returning what we have 30583 1726853721.50964: results queue empty 30583 1726853721.50965: checking for any_errors_fatal 30583 1726853721.50979: done checking for any_errors_fatal 30583 1726853721.50980: checking for max_fail_percentage 30583 1726853721.50981: done checking for max_fail_percentage 30583 
1726853721.50982: checking to see if all hosts have failed and the running result is not ok 30583 1726853721.50983: done checking to see if all hosts have failed 30583 1726853721.50984: getting the remaining hosts for this loop 30583 1726853721.50985: done getting the remaining hosts for this loop 30583 1726853721.50989: getting the next task for host managed_node2 30583 1726853721.50998: done getting next task for host managed_node2 30583 1726853721.51002: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853721.51007: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853721.51041: getting variables 30583 1726853721.51043: in VariableManager get_vars() 30583 1726853721.51148: Calling all_inventory to load vars for managed_node2 30583 1726853721.51150: Calling groups_inventory to load vars for managed_node2 30583 1726853721.51152: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853721.51165: Calling all_plugins_play to load vars for managed_node2 30583 1726853721.51167: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853721.51172: Calling groups_plugins_play to load vars for managed_node2 30583 1726853721.51689: done sending task result for task 02083763-bbaf-05ea-abc5-000000001103 30583 1726853721.51693: WORKER PROCESS EXITING 30583 1726853721.52746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853721.55709: done with get_vars() 30583 1726853721.55733: done getting variables 30583 1726853721.56000: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:35:21 -0400 (0:00:00.074) 0:00:56.897 ****** 30583 1726853721.56036: entering _queue_task() for managed_node2/copy 30583 1726853721.56618: worker is 1 (out of 1 available) 30583 1726853721.56632: exiting _queue_task() for managed_node2/copy 30583 1726853721.56645: done queuing things up, now waiting for results queue to drain 30583 1726853721.56647: waiting for pending results... 
30583 1726853721.57291: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853721.57312: in run() - task 02083763-bbaf-05ea-abc5-000000001104 30583 1726853721.57441: variable 'ansible_search_path' from source: unknown 30583 1726853721.57445: variable 'ansible_search_path' from source: unknown 30583 1726853721.57485: calling self._execute() 30583 1726853721.57736: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853721.57740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853721.57750: variable 'omit' from source: magic vars 30583 1726853721.58552: variable 'ansible_distribution_major_version' from source: facts 30583 1726853721.58567: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853721.58967: variable 'network_provider' from source: set_fact 30583 1726853721.58973: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853721.58976: when evaluation is False, skipping this task 30583 1726853721.58979: _execute() done 30583 1726853721.58982: dumping result to json 30583 1726853721.58986: done dumping result, returning 30583 1726853721.58996: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-05ea-abc5-000000001104] 30583 1726853721.58999: sending task result for task 02083763-bbaf-05ea-abc5-000000001104 30583 1726853721.59113: done sending task result for task 02083763-bbaf-05ea-abc5-000000001104 30583 1726853721.59116: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30583 1726853721.59192: no more pending results, returning what we have 30583 1726853721.59196: results queue empty 30583 1726853721.59197: checking for 
any_errors_fatal 30583 1726853721.59201: done checking for any_errors_fatal 30583 1726853721.59202: checking for max_fail_percentage 30583 1726853721.59204: done checking for max_fail_percentage 30583 1726853721.59205: checking to see if all hosts have failed and the running result is not ok 30583 1726853721.59206: done checking to see if all hosts have failed 30583 1726853721.59206: getting the remaining hosts for this loop 30583 1726853721.59208: done getting the remaining hosts for this loop 30583 1726853721.59211: getting the next task for host managed_node2 30583 1726853721.59221: done getting next task for host managed_node2 30583 1726853721.59225: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853721.59230: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853721.59252: getting variables 30583 1726853721.59254: in VariableManager get_vars() 30583 1726853721.59299: Calling all_inventory to load vars for managed_node2 30583 1726853721.59302: Calling groups_inventory to load vars for managed_node2 30583 1726853721.59304: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853721.59317: Calling all_plugins_play to load vars for managed_node2 30583 1726853721.59320: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853721.59323: Calling groups_plugins_play to load vars for managed_node2 30583 1726853721.61944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853721.65108: done with get_vars() 30583 1726853721.65139: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:35:21 -0400 (0:00:00.091) 0:00:56.989 ****** 30583 1726853721.65230: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853721.66105: worker is 1 (out of 1 available) 30583 1726853721.66118: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853721.66132: done queuing things up, now waiting for results queue to drain 30583 1726853721.66134: waiting for pending results... 
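Each `in VariableManager get_vars()` block above calls a fixed sequence of loaders (`all_inventory`, `groups_inventory`, `all_plugins_inventory`, `all_plugins_play`, `groups_plugins_inventory`, `groups_plugins_play`), after which host vars, play vars, and `set_fact` results are layered on top, as the `variable ... from source:` lines show. A toy sketch of that layered merge, where later (higher-precedence) sources win; the names below are illustrative and this flattens Ansible's real precedence ladder of 22 levels into a simple left-to-right merge:

```python
def get_vars(*sources: dict) -> dict:
    """Merge variable sources lowest-precedence first: each later
    source overrides keys set by earlier ones."""
    merged: dict = {}
    for source in sources:
        merged.update(source)
    return merged

all_inventory = {"ansible_connection": "ssh", "network_provider": "initscripts"}
host_vars = {"ansible_host": "10.31.9.197"}
set_fact_vars = {"network_provider": "nm"}  # set_fact outranks inventory

print(get_vars(all_inventory, host_vars, set_fact_vars))
```

This is why the log reports `variable 'network_provider' from source: set_fact`: the fact set at runtime shadows any lower-precedence definition of the same name.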
30583 1726853721.66818: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853721.67003: in run() - task 02083763-bbaf-05ea-abc5-000000001105 30583 1726853721.67019: variable 'ansible_search_path' from source: unknown 30583 1726853721.67026: variable 'ansible_search_path' from source: unknown 30583 1726853721.67169: calling self._execute() 30583 1726853721.67276: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853721.67396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853721.67409: variable 'omit' from source: magic vars 30583 1726853721.68196: variable 'ansible_distribution_major_version' from source: facts 30583 1726853721.68208: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853721.68217: variable 'omit' from source: magic vars 30583 1726853721.68397: variable 'omit' from source: magic vars 30583 1726853721.68764: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853721.73579: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853721.73584: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853721.73587: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853721.73589: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853721.73591: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853721.73817: variable 'network_provider' from source: set_fact 30583 1726853721.73956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853721.73989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853721.74016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853721.74061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853721.74080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853721.74160: variable 'omit' from source: magic vars 30583 1726853721.74278: variable 'omit' from source: magic vars 30583 1726853721.74387: variable 'network_connections' from source: include params 30583 1726853721.74401: variable 'interface' from source: play vars 30583 1726853721.74466: variable 'interface' from source: play vars 30583 1726853721.74628: variable 'omit' from source: magic vars 30583 1726853721.74642: variable '__lsr_ansible_managed' from source: task vars 30583 1726853721.74708: variable '__lsr_ansible_managed' from source: task vars 30583 1726853721.75255: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30583 1726853721.75463: Loaded config def from plugin (lookup/template) 30583 1726853721.75474: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30583 1726853721.75505: File lookup term: get_ansible_managed.j2 30583 1726853721.75513: variable 
'ansible_search_path' from source: unknown 30583 1726853721.75522: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30583 1726853721.75539: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30583 1726853721.75565: variable 'ansible_search_path' from source: unknown 30583 1726853721.83996: variable 'ansible_managed' from source: unknown 30583 1726853721.84303: variable 'omit' from source: magic vars 30583 1726853721.84335: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853721.84570: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853721.84574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853721.84577: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30583 1726853721.84579: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853721.84581: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853721.84583: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853721.84585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853721.84749: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853721.84762: Set connection var ansible_timeout to 10 30583 1726853721.84789: Set connection var ansible_connection to ssh 30583 1726853721.84798: Set connection var ansible_shell_executable to /bin/sh 30583 1726853721.84804: Set connection var ansible_shell_type to sh 30583 1726853721.84976: Set connection var ansible_pipelining to False 30583 1726853721.84979: variable 'ansible_shell_executable' from source: unknown 30583 1726853721.84981: variable 'ansible_connection' from source: unknown 30583 1726853721.84983: variable 'ansible_module_compression' from source: unknown 30583 1726853721.84985: variable 'ansible_shell_type' from source: unknown 30583 1726853721.84987: variable 'ansible_shell_executable' from source: unknown 30583 1726853721.84989: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853721.84991: variable 'ansible_pipelining' from source: unknown 30583 1726853721.84993: variable 'ansible_timeout' from source: unknown 30583 1726853721.84996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853721.85195: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853721.85326: variable 'omit' from 
source: magic vars 30583 1726853721.85329: starting attempt loop 30583 1726853721.85332: running the handler 30583 1726853721.85334: _low_level_execute_command(): starting 30583 1726853721.85336: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853721.86966: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853721.86985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853721.87188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853721.87199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853721.87372: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853721.87375: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853721.87494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853721.87604: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853721.89336: stdout chunk (state=3): >>>/root <<< 30583 1726853721.89492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 
1726853721.89876: stderr chunk (state=3): >>><<< 30583 1726853721.89879: stdout chunk (state=3): >>><<< 30583 1726853721.89882: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853721.89884: _low_level_execute_command(): starting 30583 1726853721.89888: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853721.8960328-33334-213392645517690 `" && echo ansible-tmp-1726853721.8960328-33334-213392645517690="` echo /root/.ansible/tmp/ansible-tmp-1726853721.8960328-33334-213392645517690 `" ) && sleep 0' 30583 1726853721.91589: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853721.91790: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853721.91895: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853721.93896: stdout chunk (state=3): >>>ansible-tmp-1726853721.8960328-33334-213392645517690=/root/.ansible/tmp/ansible-tmp-1726853721.8960328-33334-213392645517690 <<< 30583 1726853721.93999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853721.94048: stderr chunk (state=3): >>><<< 30583 1726853721.94326: stdout chunk (state=3): >>><<< 30583 1726853721.94329: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853721.8960328-33334-213392645517690=/root/.ansible/tmp/ansible-tmp-1726853721.8960328-33334-213392645517690 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853721.94351: variable 'ansible_module_compression' from source: unknown 30583 1726853721.94399: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30583 1726853721.94456: variable 'ansible_facts' from source: unknown 30583 1726853721.94813: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853721.8960328-33334-213392645517690/AnsiballZ_network_connections.py 30583 1726853721.95148: Sending initial data 30583 1726853721.95151: Sent initial data (168 bytes) 30583 1726853721.96346: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853721.96597: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853721.96787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853721.96922: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853721.98543: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853721.98609: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853721.98684: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpbvpnhmc8 /root/.ansible/tmp/ansible-tmp-1726853721.8960328-33334-213392645517690/AnsiballZ_network_connections.py <<< 30583 1726853721.98693: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853721.8960328-33334-213392645517690/AnsiballZ_network_connections.py" <<< 30583 1726853721.98759: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpbvpnhmc8" to remote "/root/.ansible/tmp/ansible-tmp-1726853721.8960328-33334-213392645517690/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853721.8960328-33334-213392645517690/AnsiballZ_network_connections.py" <<< 30583 1726853722.01591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853722.01596: stdout chunk (state=3): >>><<< 30583 1726853722.01598: stderr chunk (state=3): >>><<< 30583 1726853722.01620: done transferring module to remote 30583 1726853722.01632: _low_level_execute_command(): starting 30583 1726853722.01637: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853721.8960328-33334-213392645517690/ /root/.ansible/tmp/ansible-tmp-1726853721.8960328-33334-213392645517690/AnsiballZ_network_connections.py && sleep 0' 30583 1726853722.02888: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853722.02985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853722.03124: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853722.03161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853722.03308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853722.05236: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853722.05247: stdout chunk (state=3): >>><<< 30583 1726853722.05468: stderr chunk (state=3): >>><<< 30583 1726853722.05474: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853722.05477: _low_level_execute_command(): starting 30583 1726853722.05480: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853721.8960328-33334-213392645517690/AnsiballZ_network_connections.py && sleep 0' 30583 1726853722.06411: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853722.06490: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853722.06588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853722.06764: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853722.06896: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853722.06978: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853722.35994: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 11d9efea-f4e2-4de6-9b17-bfa7490d4840\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}}<<< 30583 1726853722.36201: stdout chunk (state=3): >>> <<< 30583 1726853722.39061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853722.39066: stdout chunk (state=3): >>><<< 30583 1726853722.39068: stderr chunk (state=3): >>><<< 30583 1726853722.39299: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 11d9efea-f4e2-4de6-9b17-bfa7490d4840\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853722.39303: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853721.8960328-33334-213392645517690/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853722.39306: _low_level_execute_command(): starting 30583 1726853722.39308: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853721.8960328-33334-213392645517690/ > /dev/null 2>&1 && sleep 0' 30583 1726853722.40710: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853722.40715: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853722.40725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853722.40740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853722.40766: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853722.40769: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853722.40774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853722.40869: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853722.40874: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853722.40876: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853722.40878: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853722.40880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853722.40882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853722.40884: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853722.40886: stderr chunk (state=3): >>>debug2: match found <<< 30583 1726853722.40887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853722.41056: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853722.41059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853722.41403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853722.43368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853722.43380: stdout chunk (state=3): >>><<< 30583 1726853722.43382: stderr chunk (state=3): >>><<< 30583 1726853722.43440: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853722.43443: handler run complete 30583 1726853722.43446: attempt loop complete, returning result 30583 1726853722.43448: _execute() done 30583 1726853722.43450: dumping result to json 30583 1726853722.43452: done dumping result, returning 30583 1726853722.43454: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-05ea-abc5-000000001105] 30583 1726853722.43459: sending task result for task 02083763-bbaf-05ea-abc5-000000001105 30583 1726853722.43676: done sending task result for task 02083763-bbaf-05ea-abc5-000000001105 changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", 
"type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 11d9efea-f4e2-4de6-9b17-bfa7490d4840 30583 1726853722.43796: no more pending results, returning what we have 30583 1726853722.43800: results queue empty 30583 1726853722.43801: checking for any_errors_fatal 30583 1726853722.43808: done checking for any_errors_fatal 30583 1726853722.43809: checking for max_fail_percentage 30583 1726853722.43811: done checking for max_fail_percentage 30583 1726853722.43812: checking to see if all hosts have failed and the running result is not ok 30583 1726853722.43812: done checking to see if all hosts have failed 30583 1726853722.43813: getting the remaining hosts for this loop 30583 1726853722.43815: done getting the remaining hosts for this loop 30583 1726853722.43818: getting the next task for host managed_node2 30583 1726853722.43825: done getting next task for host managed_node2 30583 1726853722.43829: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853722.43833: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853722.43845: getting variables 30583 1726853722.43847: in VariableManager get_vars() 30583 1726853722.44096: Calling all_inventory to load vars for managed_node2 30583 1726853722.44101: Calling groups_inventory to load vars for managed_node2 30583 1726853722.44103: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853722.44113: Calling all_plugins_play to load vars for managed_node2 30583 1726853722.44115: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853722.44118: Calling groups_plugins_play to load vars for managed_node2 30583 1726853722.44861: WORKER PROCESS EXITING 30583 1726853722.47647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853722.51377: done with get_vars() 30583 1726853722.51415: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:35:22 -0400 (0:00:00.863) 0:00:57.853 ****** 30583 1726853722.51631: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30583 1726853722.52378: worker is 1 (out of 1 available) 30583 1726853722.52394: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30583 1726853722.52477: done queuing things up, now waiting for results queue to drain 30583 1726853722.52480: waiting for pending results... 
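For reference, the module_args logged for the "Configure networking connection profiles" task above map onto the role's `network_connections` variable. A minimal sketch of a play that would produce that invocation follows; the play structure, host pattern, and role source are assumptions (only the connection values are taken from the logged result):

```yaml
# Hypothetical play; reconstructs the logged invocation of
# fedora.linux_system_roles.network_connections (provider "nm").
- hosts: managed_node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          # Values below match the logged module_args.
          - name: statebr
            type: bridge
            persistent_state: present
            ip:
              dhcp4: false   # no IPv4 DHCP
              auto6: false   # no IPv6 autoconf
```

The subsequent "Configure networking state" task is skipped in this log because `network_state` is empty (`network_state != {}` evaluated False), consistent with only `network_connections` being set.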
30583 1726853722.53042: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853722.53177: in run() - task 02083763-bbaf-05ea-abc5-000000001106 30583 1726853722.53607: variable 'ansible_search_path' from source: unknown 30583 1726853722.53610: variable 'ansible_search_path' from source: unknown 30583 1726853722.53615: calling self._execute() 30583 1726853722.53632: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853722.53667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853722.53673: variable 'omit' from source: magic vars 30583 1726853722.54425: variable 'ansible_distribution_major_version' from source: facts 30583 1726853722.54428: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853722.54767: variable 'network_state' from source: role '' defaults 30583 1726853722.55010: Evaluated conditional (network_state != {}): False 30583 1726853722.55014: when evaluation is False, skipping this task 30583 1726853722.55016: _execute() done 30583 1726853722.55019: dumping result to json 30583 1726853722.55021: done dumping result, returning 30583 1726853722.55032: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-05ea-abc5-000000001106] 30583 1726853722.55035: sending task result for task 02083763-bbaf-05ea-abc5-000000001106 30583 1726853722.55132: done sending task result for task 02083763-bbaf-05ea-abc5-000000001106 30583 1726853722.55135: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853722.55196: no more pending results, returning what we have 30583 1726853722.55201: results queue empty 30583 1726853722.55202: checking for any_errors_fatal 30583 1726853722.55214: done checking for any_errors_fatal 
30583 1726853722.55215: checking for max_fail_percentage 30583 1726853722.55217: done checking for max_fail_percentage 30583 1726853722.55218: checking to see if all hosts have failed and the running result is not ok 30583 1726853722.55219: done checking to see if all hosts have failed 30583 1726853722.55220: getting the remaining hosts for this loop 30583 1726853722.55221: done getting the remaining hosts for this loop 30583 1726853722.55225: getting the next task for host managed_node2 30583 1726853722.55234: done getting next task for host managed_node2 30583 1726853722.55239: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30583 1726853722.55245: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853722.55377: getting variables 30583 1726853722.55379: in VariableManager get_vars() 30583 1726853722.55520: Calling all_inventory to load vars for managed_node2 30583 1726853722.55523: Calling groups_inventory to load vars for managed_node2 30583 1726853722.55525: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853722.55534: Calling all_plugins_play to load vars for managed_node2 30583 1726853722.55536: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853722.55538: Calling groups_plugins_play to load vars for managed_node2 30583 1726853722.58863: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853722.60912: done with get_vars() 30583 1726853722.60945: done getting variables 30583 1726853722.61194: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:35:22 -0400 (0:00:00.096) 0:00:57.950 ****** 30583 1726853722.61319: entering _queue_task() for managed_node2/debug 30583 1726853722.62007: worker is 1 (out of 1 available) 30583 1726853722.62020: exiting _queue_task() for managed_node2/debug 30583 1726853722.62033: done queuing things up, now waiting for results queue to drain 30583 1726853722.62035: waiting for pending results... 
30583 1726853722.62646: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30583 1726853722.62651: in run() - task 02083763-bbaf-05ea-abc5-000000001107 30583 1726853722.62654: variable 'ansible_search_path' from source: unknown 30583 1726853722.62660: variable 'ansible_search_path' from source: unknown 30583 1726853722.62806: calling self._execute() 30583 1726853722.62816: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853722.62823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853722.62834: variable 'omit' from source: magic vars 30583 1726853722.63247: variable 'ansible_distribution_major_version' from source: facts 30583 1726853722.63260: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853722.63263: variable 'omit' from source: magic vars 30583 1726853722.63344: variable 'omit' from source: magic vars 30583 1726853722.63373: variable 'omit' from source: magic vars 30583 1726853722.63414: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853722.63461: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853722.63566: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853722.63569: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853722.63573: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853722.63576: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853722.63578: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853722.63581: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 30583 1726853722.63647: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853722.63653: Set connection var ansible_timeout to 10 30583 1726853722.63666: Set connection var ansible_connection to ssh 30583 1726853722.63675: Set connection var ansible_shell_executable to /bin/sh 30583 1726853722.63677: Set connection var ansible_shell_type to sh 30583 1726853722.63687: Set connection var ansible_pipelining to False 30583 1726853722.63711: variable 'ansible_shell_executable' from source: unknown 30583 1726853722.63714: variable 'ansible_connection' from source: unknown 30583 1726853722.63717: variable 'ansible_module_compression' from source: unknown 30583 1726853722.63720: variable 'ansible_shell_type' from source: unknown 30583 1726853722.63722: variable 'ansible_shell_executable' from source: unknown 30583 1726853722.63724: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853722.63726: variable 'ansible_pipelining' from source: unknown 30583 1726853722.63730: variable 'ansible_timeout' from source: unknown 30583 1726853722.63734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853722.63886: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853722.63897: variable 'omit' from source: magic vars 30583 1726853722.63902: starting attempt loop 30583 1726853722.63905: running the handler 30583 1726853722.64049: variable '__network_connections_result' from source: set_fact 30583 1726853722.64110: handler run complete 30583 1726853722.64129: attempt loop complete, returning result 30583 1726853722.64132: _execute() done 30583 1726853722.64135: dumping result to json 30583 1726853722.64137: 
done dumping result, returning 30583 1726853722.64148: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-05ea-abc5-000000001107] 30583 1726853722.64151: sending task result for task 02083763-bbaf-05ea-abc5-000000001107 30583 1726853722.64356: done sending task result for task 02083763-bbaf-05ea-abc5-000000001107 30583 1726853722.64362: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 11d9efea-f4e2-4de6-9b17-bfa7490d4840" ] } 30583 1726853722.64543: no more pending results, returning what we have 30583 1726853722.64547: results queue empty 30583 1726853722.64547: checking for any_errors_fatal 30583 1726853722.64553: done checking for any_errors_fatal 30583 1726853722.64554: checking for max_fail_percentage 30583 1726853722.64556: done checking for max_fail_percentage 30583 1726853722.64556: checking to see if all hosts have failed and the running result is not ok 30583 1726853722.64557: done checking to see if all hosts have failed 30583 1726853722.64558: getting the remaining hosts for this loop 30583 1726853722.64559: done getting the remaining hosts for this loop 30583 1726853722.64563: getting the next task for host managed_node2 30583 1726853722.64570: done getting next task for host managed_node2 30583 1726853722.64576: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30583 1726853722.64581: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853722.64593: getting variables 30583 1726853722.64595: in VariableManager get_vars() 30583 1726853722.64631: Calling all_inventory to load vars for managed_node2 30583 1726853722.64634: Calling groups_inventory to load vars for managed_node2 30583 1726853722.64637: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853722.64645: Calling all_plugins_play to load vars for managed_node2 30583 1726853722.64648: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853722.64651: Calling groups_plugins_play to load vars for managed_node2 30583 1726853722.66337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853722.68897: done with get_vars() 30583 1726853722.68935: done getting variables 30583 1726853722.69004: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:35:22 -0400 (0:00:00.077) 0:00:58.027 ****** 30583 1726853722.69053: entering _queue_task() for managed_node2/debug 30583 1726853722.69680: worker is 1 (out of 1 available) 30583 1726853722.69692: exiting _queue_task() for managed_node2/debug 30583 1726853722.69703: done queuing things up, now waiting for results queue to drain 30583 1726853722.69704: waiting for pending results... 30583 1726853722.69930: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30583 1726853722.70064: in run() - task 02083763-bbaf-05ea-abc5-000000001108 30583 1726853722.70078: variable 'ansible_search_path' from source: unknown 30583 1726853722.70081: variable 'ansible_search_path' from source: unknown 30583 1726853722.70125: calling self._execute() 30583 1726853722.70245: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853722.70249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853722.70329: variable 'omit' from source: magic vars 30583 1726853722.70665: variable 'ansible_distribution_major_version' from source: facts 30583 1726853722.70677: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853722.70684: variable 'omit' from source: magic vars 30583 1726853722.70746: variable 'omit' from source: magic vars 30583 1726853722.70783: variable 'omit' from source: magic vars 30583 1726853722.70830: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853722.70864: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853722.70978: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853722.70982: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853722.70984: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853722.71008: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853722.71012: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853722.71030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853722.71226: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853722.71262: Set connection var ansible_timeout to 10 30583 1726853722.71266: Set connection var ansible_connection to ssh 30583 1726853722.71268: Set connection var ansible_shell_executable to /bin/sh 30583 1726853722.71272: Set connection var ansible_shell_type to sh 30583 1726853722.71283: Set connection var ansible_pipelining to False 30583 1726853722.71323: variable 'ansible_shell_executable' from source: unknown 30583 1726853722.71327: variable 'ansible_connection' from source: unknown 30583 1726853722.71330: variable 'ansible_module_compression' from source: unknown 30583 1726853722.71332: variable 'ansible_shell_type' from source: unknown 30583 1726853722.71335: variable 'ansible_shell_executable' from source: unknown 30583 1726853722.71337: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853722.71339: variable 'ansible_pipelining' from source: unknown 30583 1726853722.71341: variable 'ansible_timeout' from source: unknown 30583 1726853722.71377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853722.71548: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853722.71561: variable 'omit' from source: magic vars 30583 1726853722.71564: starting attempt loop 30583 1726853722.71567: running the handler 30583 1726853722.71847: variable '__network_connections_result' from source: set_fact 30583 1726853722.72025: variable '__network_connections_result' from source: set_fact 30583 1726853722.72286: handler run complete 30583 1726853722.72289: attempt loop complete, returning result 30583 1726853722.72292: _execute() done 30583 1726853722.72294: dumping result to json 30583 1726853722.72297: done dumping result, returning 30583 1726853722.72308: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-05ea-abc5-000000001108] 30583 1726853722.72311: sending task result for task 02083763-bbaf-05ea-abc5-000000001108 30583 1726853722.72581: done sending task result for task 02083763-bbaf-05ea-abc5-000000001108 30583 1726853722.72584: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 11d9efea-f4e2-4de6-9b17-bfa7490d4840\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 11d9efea-f4e2-4de6-9b17-bfa7490d4840" ] } } 30583 1726853722.72687: no more pending results, returning what we have 30583 1726853722.72692: results queue 
empty 30583 1726853722.72693: checking for any_errors_fatal 30583 1726853722.72701: done checking for any_errors_fatal 30583 1726853722.72702: checking for max_fail_percentage 30583 1726853722.72704: done checking for max_fail_percentage 30583 1726853722.72706: checking to see if all hosts have failed and the running result is not ok 30583 1726853722.72706: done checking to see if all hosts have failed 30583 1726853722.72707: getting the remaining hosts for this loop 30583 1726853722.72709: done getting the remaining hosts for this loop 30583 1726853722.72713: getting the next task for host managed_node2 30583 1726853722.72722: done getting next task for host managed_node2 30583 1726853722.72727: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30583 1726853722.72732: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853722.72748: getting variables 30583 1726853722.72750: in VariableManager get_vars() 30583 1726853722.73028: Calling all_inventory to load vars for managed_node2 30583 1726853722.73032: Calling groups_inventory to load vars for managed_node2 30583 1726853722.73036: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853722.73047: Calling all_plugins_play to load vars for managed_node2 30583 1726853722.73051: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853722.73055: Calling groups_plugins_play to load vars for managed_node2 30583 1726853722.75454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853722.77414: done with get_vars() 30583 1726853722.77437: done getting variables 30583 1726853722.77506: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:35:22 -0400 (0:00:00.084) 0:00:58.112 ****** 30583 1726853722.77542: entering _queue_task() for managed_node2/debug 30583 1726853722.78026: worker is 1 (out of 1 available) 30583 1726853722.78039: exiting _queue_task() for managed_node2/debug 30583 1726853722.78052: done queuing things up, now waiting for results queue to drain 30583 1726853722.78054: waiting for pending results... 
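The `__network_connections_result` JSON dumped above records the exact `module_args` the role sent to its `network_connections` module. Rebuilt as a plain Python structure for readability (a sketch mirroring the logged JSON; the variable names here are illustrative, not Ansible's own):

```python
import json

# Hedged sketch: the "connections" input that produced the result logged
# above, reconstructed from the module_args JSON in the transcript.
connections = [
    {
        "name": "statebr",
        "type": "bridge",
        "persistent_state": "present",
        "ip": {"dhcp4": False, "auto6": False},
    }
]

# Per the log, the role hands this list to the provider "nm" (NetworkManager);
# it round-trips cleanly as JSON, which is how it appears in the debug output.
payload = {"connections": connections, "provider": "nm"}
print(json.dumps(payload, indent=2))
```

The `"changed": true` and the `[002] #0 ... add connection statebr` stderr line in the result indicate that this input caused a new persistent connection profile to be created.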
30583 1726853722.78724: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30583 1726853722.78730: in run() - task 02083763-bbaf-05ea-abc5-000000001109 30583 1726853722.78733: variable 'ansible_search_path' from source: unknown 30583 1726853722.78735: variable 'ansible_search_path' from source: unknown 30583 1726853722.78738: calling self._execute() 30583 1726853722.78835: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853722.78839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853722.78850: variable 'omit' from source: magic vars 30583 1726853722.79629: variable 'ansible_distribution_major_version' from source: facts 30583 1726853722.79640: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853722.79889: variable 'network_state' from source: role '' defaults 30583 1726853722.79968: Evaluated conditional (network_state != {}): False 30583 1726853722.79973: when evaluation is False, skipping this task 30583 1726853722.79976: _execute() done 30583 1726853722.79978: dumping result to json 30583 1726853722.79983: done dumping result, returning 30583 1726853722.79992: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-05ea-abc5-000000001109] 30583 1726853722.80014: sending task result for task 02083763-bbaf-05ea-abc5-000000001109 30583 1726853722.80200: done sending task result for task 02083763-bbaf-05ea-abc5-000000001109 30583 1726853722.80202: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 30583 1726853722.80274: no more pending results, returning what we have 30583 1726853722.80278: results queue empty 30583 1726853722.80280: checking for any_errors_fatal 30583 1726853722.80290: done checking for any_errors_fatal 30583 1726853722.80291: checking for 
max_fail_percentage 30583 1726853722.80293: done checking for max_fail_percentage 30583 1726853722.80295: checking to see if all hosts have failed and the running result is not ok 30583 1726853722.80295: done checking to see if all hosts have failed 30583 1726853722.80296: getting the remaining hosts for this loop 30583 1726853722.80298: done getting the remaining hosts for this loop 30583 1726853722.80303: getting the next task for host managed_node2 30583 1726853722.80313: done getting next task for host managed_node2 30583 1726853722.80317: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30583 1726853722.80330: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853722.80353: getting variables 30583 1726853722.80355: in VariableManager get_vars() 30583 1726853722.80397: Calling all_inventory to load vars for managed_node2 30583 1726853722.80401: Calling groups_inventory to load vars for managed_node2 30583 1726853722.80403: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853722.80415: Calling all_plugins_play to load vars for managed_node2 30583 1726853722.80418: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853722.80421: Calling groups_plugins_play to load vars for managed_node2 30583 1726853722.81987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853722.84163: done with get_vars() 30583 1726853722.84203: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:35:22 -0400 (0:00:00.067) 0:00:58.180 ****** 30583 1726853722.84347: entering _queue_task() for managed_node2/ping 30583 1726853722.85139: worker is 1 (out of 1 available) 30583 1726853722.85152: exiting _queue_task() for managed_node2/ping 30583 1726853722.85163: done queuing things up, now waiting for results queue to drain 30583 1726853722.85165: waiting for pending results... 
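The skip recorded just above (`Evaluated conditional (network_state != {}): False` followed by `skipping: [managed_node2]`) is Ansible's standard `when:` handling: the failing expression is echoed back as `false_condition`. A simplified stand-in for that decision, not Ansible's actual TaskExecutor code:

```python
# Hedged sketch of the skip decision seen in the log. Ansible templates the
# `when:` expression through Jinja2; eval() here is purely illustrative.
def evaluate_when(expression: str, variables: dict) -> dict:
    if bool(eval(expression, {}, dict(variables))):
        return {"changed": False}  # conditional True: the task would run
    # Conditional False: record which expression failed and skip the task.
    return {
        "skipped": True,
        "skip_reason": "Conditional result was False",
        "false_condition": expression,
    }

# network_state comes from the role's defaults and is {} in this run,
# so the "Show debug messages for the network_state" task is skipped:
result = evaluate_when("network_state != {}", {"network_state": {}})
print(result)
```

This matches the transcript: the role only ran the `network_connections` debug task, because this play configured connections rather than a declarative `network_state`.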
30583 1726853722.85636: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 30583 1726853722.85642: in run() - task 02083763-bbaf-05ea-abc5-00000000110a 30583 1726853722.85645: variable 'ansible_search_path' from source: unknown 30583 1726853722.85649: variable 'ansible_search_path' from source: unknown 30583 1726853722.85652: calling self._execute() 30583 1726853722.85760: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853722.85764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853722.85795: variable 'omit' from source: magic vars 30583 1726853722.86261: variable 'ansible_distribution_major_version' from source: facts 30583 1726853722.86266: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853722.86269: variable 'omit' from source: magic vars 30583 1726853722.86440: variable 'omit' from source: magic vars 30583 1726853722.86448: variable 'omit' from source: magic vars 30583 1726853722.86521: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853722.86526: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853722.86732: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853722.86736: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853722.86738: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853722.86741: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853722.86743: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853722.86745: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30583 1726853722.86756: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853722.86764: Set connection var ansible_timeout to 10 30583 1726853722.86767: Set connection var ansible_connection to ssh 30583 1726853722.86822: Set connection var ansible_shell_executable to /bin/sh 30583 1726853722.86825: Set connection var ansible_shell_type to sh 30583 1726853722.86827: Set connection var ansible_pipelining to False 30583 1726853722.86845: variable 'ansible_shell_executable' from source: unknown 30583 1726853722.86853: variable 'ansible_connection' from source: unknown 30583 1726853722.86857: variable 'ansible_module_compression' from source: unknown 30583 1726853722.86862: variable 'ansible_shell_type' from source: unknown 30583 1726853722.86864: variable 'ansible_shell_executable' from source: unknown 30583 1726853722.86866: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853722.86869: variable 'ansible_pipelining' from source: unknown 30583 1726853722.86872: variable 'ansible_timeout' from source: unknown 30583 1726853722.86874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853722.87311: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853722.87320: variable 'omit' from source: magic vars 30583 1726853722.87325: starting attempt loop 30583 1726853722.87328: running the handler 30583 1726853722.87382: _low_level_execute_command(): starting 30583 1726853722.87385: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853722.88692: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853722.88696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 
1726853722.88699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853722.88713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853722.88725: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853722.88733: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853722.88742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853722.88756: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853722.88793: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853722.88898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853722.88902: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853722.88982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853722.90856: stdout chunk (state=3): >>>/root <<< 30583 1726853722.90862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853722.90903: stderr chunk (state=3): >>><<< 30583 1726853722.90906: stdout chunk (state=3): >>><<< 30583 1726853722.91176: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853722.91181: _low_level_execute_command(): starting 30583 1726853722.91184: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853722.910548-33376-50117608811356 `" && echo ansible-tmp-1726853722.910548-33376-50117608811356="` echo /root/.ansible/tmp/ansible-tmp-1726853722.910548-33376-50117608811356 `" ) && sleep 0' 30583 1726853722.92454: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853722.92465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853722.92503: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: 
match not found <<< 30583 1726853722.92511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853722.92730: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853722.92834: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853722.94932: stdout chunk (state=3): >>>ansible-tmp-1726853722.910548-33376-50117608811356=/root/.ansible/tmp/ansible-tmp-1726853722.910548-33376-50117608811356 <<< 30583 1726853722.95186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853722.95189: stdout chunk (state=3): >>><<< 30583 1726853722.95191: stderr chunk (state=3): >>><<< 30583 1726853722.95304: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853722.910548-33376-50117608811356=/root/.ansible/tmp/ansible-tmp-1726853722.910548-33376-50117608811356 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853722.95307: variable 'ansible_module_compression' from source: unknown 30583 1726853722.95375: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30583 1726853722.95425: variable 'ansible_facts' from source: unknown 30583 1726853722.95533: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853722.910548-33376-50117608811356/AnsiballZ_ping.py 30583 1726853722.95741: Sending initial data 30583 1726853722.95744: Sent initial data (151 bytes) 30583 1726853722.96530: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853722.96645: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853722.96819: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853722.98510: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853722.98574: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853722.98918: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpd2v6e8_j /root/.ansible/tmp/ansible-tmp-1726853722.910548-33376-50117608811356/AnsiballZ_ping.py <<< 30583 1726853722.98922: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853722.910548-33376-50117608811356/AnsiballZ_ping.py" <<< 30583 1726853722.98998: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpd2v6e8_j" to remote "/root/.ansible/tmp/ansible-tmp-1726853722.910548-33376-50117608811356/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853722.910548-33376-50117608811356/AnsiballZ_ping.py" <<< 30583 1726853723.00236: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853723.00309: stderr chunk (state=3): >>><<< 30583 1726853723.00327: stdout chunk (state=3): >>><<< 30583 1726853723.00366: done transferring module to remote 30583 1726853723.00393: _low_level_execute_command(): starting 30583 1726853723.00403: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853722.910548-33376-50117608811356/ /root/.ansible/tmp/ansible-tmp-1726853722.910548-33376-50117608811356/AnsiballZ_ping.py && sleep 0' 30583 1726853723.01161: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853723.01190: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853723.01210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853723.01308: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853723.01351: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853723.01382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853723.01542: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853723.01697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853723.03604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853723.03672: stderr chunk (state=3): >>><<< 30583 1726853723.03675: stdout chunk (state=3): >>><<< 30583 1726853723.03690: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853723.03693: _low_level_execute_command(): starting 30583 1726853723.03700: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853722.910548-33376-50117608811356/AnsiballZ_ping.py && sleep 0' 30583 1726853723.04342: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853723.04355: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853723.04374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853723.04391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853723.04406: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853723.04498: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853723.04536: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853723.04563: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853723.04592: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853723.04695: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853723.20395: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30583 1726853723.22006: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853723.22012: stdout chunk (state=3): >>><<< 30583 1726853723.22015: stderr chunk (state=3): >>><<< 30583 1726853723.22019: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853723.22022: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853722.910548-33376-50117608811356/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853723.22025: _low_level_execute_command(): starting 30583 1726853723.22028: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853722.910548-33376-50117608811356/ > /dev/null 2>&1 && sleep 0' 30583 1726853723.23204: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853723.23307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853723.23311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853723.23315: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853723.23317: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 
1726853723.23349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853723.23363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853723.23598: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853723.23627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853723.25641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853723.25645: stdout chunk (state=3): >>><<< 30583 1726853723.25877: stderr chunk (state=3): >>><<< 30583 1726853723.25882: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853723.25884: handler run complete 30583 1726853723.25887: attempt loop complete, returning result 30583 1726853723.25889: _execute() done 30583 1726853723.25891: dumping result to json 30583 1726853723.25893: done dumping result, returning 30583 1726853723.25895: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-05ea-abc5-00000000110a] 30583 1726853723.25897: sending task result for task 02083763-bbaf-05ea-abc5-00000000110a 30583 1726853723.25961: done sending task result for task 02083763-bbaf-05ea-abc5-00000000110a 30583 1726853723.25965: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 30583 1726853723.26111: no more pending results, returning what we have 30583 1726853723.26115: results queue empty 30583 1726853723.26116: checking for any_errors_fatal 30583 1726853723.26124: done checking for any_errors_fatal 30583 1726853723.26124: checking for max_fail_percentage 30583 1726853723.26127: done checking for max_fail_percentage 30583 1726853723.26128: checking to see if all hosts have failed and the running result is not ok 30583 1726853723.26129: done checking to see if all hosts have failed 30583 1726853723.26129: getting the remaining hosts for this loop 30583 1726853723.26131: done getting the remaining hosts for this loop 30583 1726853723.26135: getting the next task for host managed_node2 30583 1726853723.26148: done getting next task for host managed_node2 30583 1726853723.26150: ^ task is: TASK: meta (role_complete) 30583 1726853723.26379: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853723.26395: getting variables 30583 1726853723.26402: in VariableManager get_vars() 30583 1726853723.26450: Calling all_inventory to load vars for managed_node2 30583 1726853723.26453: Calling groups_inventory to load vars for managed_node2 30583 1726853723.26456: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853723.26468: Calling all_plugins_play to load vars for managed_node2 30583 1726853723.26777: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853723.26783: Calling groups_plugins_play to load vars for managed_node2 30583 1726853723.29032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853723.31134: done with get_vars() 30583 1726853723.31169: done getting variables 30583 1726853723.31270: done queuing things up, now waiting for results queue to drain 30583 1726853723.31274: results queue empty 30583 1726853723.31275: checking for any_errors_fatal 30583 1726853723.31278: done checking for 
any_errors_fatal 30583 1726853723.31279: checking for max_fail_percentage 30583 1726853723.31280: done checking for max_fail_percentage 30583 1726853723.31281: checking to see if all hosts have failed and the running result is not ok 30583 1726853723.31282: done checking to see if all hosts have failed 30583 1726853723.31283: getting the remaining hosts for this loop 30583 1726853723.31284: done getting the remaining hosts for this loop 30583 1726853723.31287: getting the next task for host managed_node2 30583 1726853723.31292: done getting next task for host managed_node2 30583 1726853723.31294: ^ task is: TASK: Show result 30583 1726853723.31297: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853723.31308: getting variables 30583 1726853723.31310: in VariableManager get_vars() 30583 1726853723.31323: Calling all_inventory to load vars for managed_node2 30583 1726853723.31325: Calling groups_inventory to load vars for managed_node2 30583 1726853723.31327: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853723.31349: Calling all_plugins_play to load vars for managed_node2 30583 1726853723.31351: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853723.31354: Calling groups_plugins_play to load vars for managed_node2 30583 1726853723.33185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853723.34922: done with get_vars() 30583 1726853723.34955: done getting variables 30583 1726853723.35001: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Friday 20 September 2024 13:35:23 -0400 (0:00:00.506) 0:00:58.687 ****** 30583 1726853723.35041: entering _queue_task() for managed_node2/debug 30583 1726853723.35739: worker is 1 (out of 1 available) 30583 1726853723.35753: exiting _queue_task() for managed_node2/debug 30583 1726853723.35766: done queuing things up, now waiting for results queue to drain 30583 1726853723.35767: waiting for pending results... 
30583 1726853723.36284: running TaskExecutor() for managed_node2/TASK: Show result 30583 1726853723.36431: in run() - task 02083763-bbaf-05ea-abc5-000000001090 30583 1726853723.36451: variable 'ansible_search_path' from source: unknown 30583 1726853723.36463: variable 'ansible_search_path' from source: unknown 30583 1726853723.36506: calling self._execute() 30583 1726853723.36613: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853723.36629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853723.36646: variable 'omit' from source: magic vars 30583 1726853723.37040: variable 'ansible_distribution_major_version' from source: facts 30583 1726853723.37069: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853723.37084: variable 'omit' from source: magic vars 30583 1726853723.37141: variable 'omit' from source: magic vars 30583 1726853723.37188: variable 'omit' from source: magic vars 30583 1726853723.37232: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853723.37275: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853723.37305: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853723.37327: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853723.37343: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853723.37380: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853723.37394: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853723.37402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853723.37512: Set 
connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853723.37525: Set connection var ansible_timeout to 10 30583 1726853723.37534: Set connection var ansible_connection to ssh 30583 1726853723.37545: Set connection var ansible_shell_executable to /bin/sh 30583 1726853723.37553: Set connection var ansible_shell_type to sh 30583 1726853723.37573: Set connection var ansible_pipelining to False 30583 1726853723.37600: variable 'ansible_shell_executable' from source: unknown 30583 1726853723.37667: variable 'ansible_connection' from source: unknown 30583 1726853723.37673: variable 'ansible_module_compression' from source: unknown 30583 1726853723.37676: variable 'ansible_shell_type' from source: unknown 30583 1726853723.37678: variable 'ansible_shell_executable' from source: unknown 30583 1726853723.37680: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853723.37683: variable 'ansible_pipelining' from source: unknown 30583 1726853723.37685: variable 'ansible_timeout' from source: unknown 30583 1726853723.37687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853723.37778: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853723.37787: variable 'omit' from source: magic vars 30583 1726853723.37792: starting attempt loop 30583 1726853723.37795: running the handler 30583 1726853723.37836: variable '__network_connections_result' from source: set_fact 30583 1726853723.37900: variable '__network_connections_result' from source: set_fact 30583 1726853723.37990: handler run complete 30583 1726853723.38007: attempt loop complete, returning result 30583 1726853723.38010: _execute() done 30583 1726853723.38012: dumping result to json 30583 
1726853723.38016: done dumping result, returning 30583 1726853723.38025: done running TaskExecutor() for managed_node2/TASK: Show result [02083763-bbaf-05ea-abc5-000000001090] 30583 1726853723.38027: sending task result for task 02083763-bbaf-05ea-abc5-000000001090 ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 11d9efea-f4e2-4de6-9b17-bfa7490d4840\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 11d9efea-f4e2-4de6-9b17-bfa7490d4840" ] } } 30583 1726853723.38283: no more pending results, returning what we have 30583 1726853723.38287: results queue empty 30583 1726853723.38288: checking for any_errors_fatal 30583 1726853723.38290: done checking for any_errors_fatal 30583 1726853723.38291: checking for max_fail_percentage 30583 1726853723.38293: done checking for max_fail_percentage 30583 1726853723.38294: checking to see if all hosts have failed and the running result is not ok 30583 1726853723.38294: done checking to see if all hosts have failed 30583 1726853723.38295: getting the remaining hosts for this loop 30583 1726853723.38304: done getting the remaining hosts for this loop 30583 1726853723.38308: getting the next task for host managed_node2 30583 1726853723.38319: done getting next task for host managed_node2 30583 1726853723.38323: ^ task is: TASK: Include network role 30583 1726853723.38326: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853723.38330: getting variables 30583 1726853723.38331: in VariableManager get_vars() 30583 1726853723.38373: Calling all_inventory to load vars for managed_node2 30583 1726853723.38376: Calling groups_inventory to load vars for managed_node2 30583 1726853723.38380: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853723.38387: done sending task result for task 02083763-bbaf-05ea-abc5-000000001090 30583 1726853723.38390: WORKER PROCESS EXITING 30583 1726853723.38399: Calling all_plugins_play to load vars for managed_node2 30583 1726853723.38403: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853723.38416: Calling groups_plugins_play to load vars for managed_node2 30583 1726853723.44852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853723.47079: done with get_vars() 30583 1726853723.47110: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3 Friday 20 September 2024 13:35:23 -0400 (0:00:00.121) 0:00:58.809 ****** 30583 1726853723.47194: entering _queue_task() 
for managed_node2/include_role 30583 1726853723.47554: worker is 1 (out of 1 available) 30583 1726853723.47567: exiting _queue_task() for managed_node2/include_role 30583 1726853723.47581: done queuing things up, now waiting for results queue to drain 30583 1726853723.47583: waiting for pending results... 30583 1726853723.47967: running TaskExecutor() for managed_node2/TASK: Include network role 30583 1726853723.48021: in run() - task 02083763-bbaf-05ea-abc5-000000001094 30583 1726853723.48035: variable 'ansible_search_path' from source: unknown 30583 1726853723.48040: variable 'ansible_search_path' from source: unknown 30583 1726853723.48078: calling self._execute() 30583 1726853723.48176: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853723.48181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853723.48196: variable 'omit' from source: magic vars 30583 1726853723.48580: variable 'ansible_distribution_major_version' from source: facts 30583 1726853723.48592: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853723.48601: _execute() done 30583 1726853723.48604: dumping result to json 30583 1726853723.48607: done dumping result, returning 30583 1726853723.48610: done running TaskExecutor() for managed_node2/TASK: Include network role [02083763-bbaf-05ea-abc5-000000001094] 30583 1726853723.48631: sending task result for task 02083763-bbaf-05ea-abc5-000000001094 30583 1726853723.48822: no more pending results, returning what we have 30583 1726853723.48828: in VariableManager get_vars() 30583 1726853723.48874: Calling all_inventory to load vars for managed_node2 30583 1726853723.48877: Calling groups_inventory to load vars for managed_node2 30583 1726853723.48881: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853723.48895: Calling all_plugins_play to load vars for managed_node2 30583 1726853723.48898: Calling groups_plugins_inventory to load 
vars for managed_node2 30583 1726853723.48902: Calling groups_plugins_play to load vars for managed_node2 30583 1726853723.49450: done sending task result for task 02083763-bbaf-05ea-abc5-000000001094 30583 1726853723.49454: WORKER PROCESS EXITING 30583 1726853723.50609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853723.54145: done with get_vars() 30583 1726853723.54381: variable 'ansible_search_path' from source: unknown 30583 1726853723.54383: variable 'ansible_search_path' from source: unknown 30583 1726853723.54703: variable 'omit' from source: magic vars 30583 1726853723.54977: variable 'omit' from source: magic vars 30583 1726853723.54993: variable 'omit' from source: magic vars 30583 1726853723.54997: we have included files to process 30583 1726853723.54998: generating all_blocks data 30583 1726853723.55002: done generating all_blocks data 30583 1726853723.55007: processing included file: fedora.linux_system_roles.network 30583 1726853723.55028: in VariableManager get_vars() 30583 1726853723.55232: done with get_vars() 30583 1726853723.55380: in VariableManager get_vars() 30583 1726853723.55399: done with get_vars() 30583 1726853723.55437: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30583 1726853723.55668: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30583 1726853723.55781: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30583 1726853723.56374: in VariableManager get_vars() 30583 1726853723.56446: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30583 1726853723.58773: iterating over new_blocks loaded from include file 30583 1726853723.58776: in VariableManager get_vars() 30583 1726853723.58793: done with get_vars() 30583 1726853723.58795: 
filtering new block on tags 30583 1726853723.59074: done filtering new block on tags 30583 1726853723.59079: in VariableManager get_vars() 30583 1726853723.59094: done with get_vars() 30583 1726853723.59096: filtering new block on tags 30583 1726853723.59112: done filtering new block on tags 30583 1726853723.59114: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30583 1726853723.59119: extending task lists for all hosts with included blocks 30583 1726853723.59229: done extending task lists 30583 1726853723.59231: done processing included files 30583 1726853723.59231: results queue empty 30583 1726853723.59232: checking for any_errors_fatal 30583 1726853723.59237: done checking for any_errors_fatal 30583 1726853723.59238: checking for max_fail_percentage 30583 1726853723.59240: done checking for max_fail_percentage 30583 1726853723.59240: checking to see if all hosts have failed and the running result is not ok 30583 1726853723.59241: done checking to see if all hosts have failed 30583 1726853723.59242: getting the remaining hosts for this loop 30583 1726853723.59243: done getting the remaining hosts for this loop 30583 1726853723.59245: getting the next task for host managed_node2 30583 1726853723.59250: done getting next task for host managed_node2 30583 1726853723.59253: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30583 1726853723.59255: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853723.59268: getting variables 30583 1726853723.59269: in VariableManager get_vars() 30583 1726853723.59283: Calling all_inventory to load vars for managed_node2 30583 1726853723.59286: Calling groups_inventory to load vars for managed_node2 30583 1726853723.59287: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853723.59293: Calling all_plugins_play to load vars for managed_node2 30583 1726853723.59295: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853723.59298: Calling groups_plugins_play to load vars for managed_node2 30583 1726853723.60711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853723.62282: done with get_vars() 30583 1726853723.62305: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:35:23 -0400 (0:00:00.152) 0:00:58.961 ****** 30583 1726853723.62423: entering _queue_task() for managed_node2/include_tasks 30583 1726853723.62792: worker is 1 (out of 1 available) 30583 
1726853723.62804: exiting _queue_task() for managed_node2/include_tasks 30583 1726853723.62815: done queuing things up, now waiting for results queue to drain 30583 1726853723.62816: waiting for pending results... 30583 1726853723.63122: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30583 1726853723.63329: in run() - task 02083763-bbaf-05ea-abc5-00000000127a 30583 1726853723.63332: variable 'ansible_search_path' from source: unknown 30583 1726853723.63335: variable 'ansible_search_path' from source: unknown 30583 1726853723.63338: calling self._execute() 30583 1726853723.63418: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853723.63423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853723.63433: variable 'omit' from source: magic vars 30583 1726853723.63789: variable 'ansible_distribution_major_version' from source: facts 30583 1726853723.63800: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853723.63806: _execute() done 30583 1726853723.63809: dumping result to json 30583 1726853723.63811: done dumping result, returning 30583 1726853723.63820: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-05ea-abc5-00000000127a] 30583 1726853723.63823: sending task result for task 02083763-bbaf-05ea-abc5-00000000127a 30583 1726853723.63921: done sending task result for task 02083763-bbaf-05ea-abc5-00000000127a 30583 1726853723.63924: WORKER PROCESS EXITING 30583 1726853723.63979: no more pending results, returning what we have 30583 1726853723.63985: in VariableManager get_vars() 30583 1726853723.64027: Calling all_inventory to load vars for managed_node2 30583 1726853723.64030: Calling groups_inventory to load vars for managed_node2 30583 1726853723.64033: Calling all_plugins_inventory to load vars for managed_node2 
30583 1726853723.64043: Calling all_plugins_play to load vars for managed_node2 30583 1726853723.64046: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853723.64049: Calling groups_plugins_play to load vars for managed_node2 30583 1726853723.65605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853723.68236: done with get_vars() 30583 1726853723.68264: variable 'ansible_search_path' from source: unknown 30583 1726853723.68265: variable 'ansible_search_path' from source: unknown 30583 1726853723.68309: we have included files to process 30583 1726853723.68310: generating all_blocks data 30583 1726853723.68312: done generating all_blocks data 30583 1726853723.68315: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853723.68316: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853723.68319: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853723.69114: done processing included file 30583 1726853723.69116: iterating over new_blocks loaded from include file 30583 1726853723.69118: in VariableManager get_vars() 30583 1726853723.69144: done with get_vars() 30583 1726853723.69146: filtering new block on tags 30583 1726853723.69187: done filtering new block on tags 30583 1726853723.69190: in VariableManager get_vars() 30583 1726853723.69213: done with get_vars() 30583 1726853723.69215: filtering new block on tags 30583 1726853723.69259: done filtering new block on tags 30583 1726853723.69262: in VariableManager get_vars() 30583 1726853723.69293: done with get_vars() 30583 1726853723.69295: filtering new block on tags 30583 1726853723.69338: done filtering new block on tags 30583 1726853723.69340: done iterating over new_blocks 
loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30583 1726853723.69345: extending task lists for all hosts with included blocks 30583 1726853723.72803: done extending task lists 30583 1726853723.72805: done processing included files 30583 1726853723.72806: results queue empty 30583 1726853723.72806: checking for any_errors_fatal 30583 1726853723.72810: done checking for any_errors_fatal 30583 1726853723.72811: checking for max_fail_percentage 30583 1726853723.72812: done checking for max_fail_percentage 30583 1726853723.72813: checking to see if all hosts have failed and the running result is not ok 30583 1726853723.72814: done checking to see if all hosts have failed 30583 1726853723.72814: getting the remaining hosts for this loop 30583 1726853723.72816: done getting the remaining hosts for this loop 30583 1726853723.72819: getting the next task for host managed_node2 30583 1726853723.72824: done getting next task for host managed_node2 30583 1726853723.73080: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30583 1726853723.73085: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853723.73097: getting variables 30583 1726853723.73100: in VariableManager get_vars() 30583 1726853723.73120: Calling all_inventory to load vars for managed_node2 30583 1726853723.73123: Calling groups_inventory to load vars for managed_node2 30583 1726853723.73125: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853723.73131: Calling all_plugins_play to load vars for managed_node2 30583 1726853723.73134: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853723.73137: Calling groups_plugins_play to load vars for managed_node2 30583 1726853723.75824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853723.79360: done with get_vars() 30583 1726853723.79386: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:35:23 -0400 (0:00:00.170) 0:00:59.132 ****** 30583 1726853723.79492: entering _queue_task() for managed_node2/setup 30583 1726853723.80029: worker is 1 (out of 1 available) 30583 1726853723.80038: exiting _queue_task() for managed_node2/setup 30583 
1726853723.80050: done queuing things up, now waiting for results queue to drain 30583 1726853723.80051: waiting for pending results... 30583 1726853723.80329: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30583 1726853723.80428: in run() - task 02083763-bbaf-05ea-abc5-0000000012d1 30583 1726853723.80433: variable 'ansible_search_path' from source: unknown 30583 1726853723.80436: variable 'ansible_search_path' from source: unknown 30583 1726853723.80663: calling self._execute() 30583 1726853723.80668: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853723.80673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853723.80677: variable 'omit' from source: magic vars 30583 1726853723.81064: variable 'ansible_distribution_major_version' from source: facts 30583 1726853723.81068: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853723.81280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853723.84776: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853723.84865: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853723.84912: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853723.84953: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853723.84993: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853723.85068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 30583 1726853723.85111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853723.85141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853723.85194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853723.85214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853723.85267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853723.85297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853723.85325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853723.85373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853723.85391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853723.85564: variable '__network_required_facts' from source: role '' defaults 30583 1726853723.85581: variable 'ansible_facts' from source: unknown 30583 1726853723.87115: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30583 1726853723.87127: when evaluation is False, skipping this task 30583 1726853723.87332: _execute() done 30583 1726853723.87335: dumping result to json 30583 1726853723.87338: done dumping result, returning 30583 1726853723.87341: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-05ea-abc5-0000000012d1] 30583 1726853723.87343: sending task result for task 02083763-bbaf-05ea-abc5-0000000012d1 30583 1726853723.87420: done sending task result for task 02083763-bbaf-05ea-abc5-0000000012d1 30583 1726853723.87424: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853723.87475: no more pending results, returning what we have 30583 1726853723.87483: results queue empty 30583 1726853723.87485: checking for any_errors_fatal 30583 1726853723.87486: done checking for any_errors_fatal 30583 1726853723.87487: checking for max_fail_percentage 30583 1726853723.87489: done checking for max_fail_percentage 30583 1726853723.87490: checking to see if all hosts have failed and the running result is not ok 30583 1726853723.87491: done checking to see if all hosts have failed 30583 1726853723.87492: getting the remaining hosts for this loop 30583 1726853723.87494: done getting the remaining hosts for this loop 30583 1726853723.87497: getting the next task for host managed_node2 30583 1726853723.87508: done getting next task for host managed_node2 
30583 1726853723.87512: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30583 1726853723.87517: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853723.87541: getting variables 30583 1726853723.87543: in VariableManager get_vars() 30583 1726853723.87677: Calling all_inventory to load vars for managed_node2 30583 1726853723.87680: Calling groups_inventory to load vars for managed_node2 30583 1726853723.87683: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853723.87693: Calling all_plugins_play to load vars for managed_node2 30583 1726853723.87774: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853723.87786: Calling groups_plugins_play to load vars for managed_node2 30583 1726853723.89575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853723.91268: done with get_vars() 30583 1726853723.91295: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:35:23 -0400 (0:00:00.119) 0:00:59.251 ****** 30583 1726853723.91406: entering _queue_task() for managed_node2/stat 30583 1726853723.91825: worker is 1 (out of 1 available) 30583 1726853723.91841: exiting _queue_task() for managed_node2/stat 30583 1726853723.91853: done queuing things up, now waiting for results queue to drain 30583 1726853723.91855: waiting for pending results... 
30583 1726853723.92504: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30583 1726853723.92709: in run() - task 02083763-bbaf-05ea-abc5-0000000012d3 30583 1726853723.92717: variable 'ansible_search_path' from source: unknown 30583 1726853723.92726: variable 'ansible_search_path' from source: unknown 30583 1726853723.92775: calling self._execute() 30583 1726853723.93033: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853723.93045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853723.93064: variable 'omit' from source: magic vars 30583 1726853723.93912: variable 'ansible_distribution_major_version' from source: facts 30583 1726853723.93994: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853723.94333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853723.94813: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853723.94874: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853723.94918: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853723.94955: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853723.95055: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853723.95100: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853723.95131: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853723.95184: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853723.95276: variable '__network_is_ostree' from source: set_fact 30583 1726853723.95292: Evaluated conditional (not __network_is_ostree is defined): False 30583 1726853723.95319: when evaluation is False, skipping this task 30583 1726853723.95322: _execute() done 30583 1726853723.95325: dumping result to json 30583 1726853723.95328: done dumping result, returning 30583 1726853723.95335: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-05ea-abc5-0000000012d3] 30583 1726853723.95403: sending task result for task 02083763-bbaf-05ea-abc5-0000000012d3 30583 1726853723.95481: done sending task result for task 02083763-bbaf-05ea-abc5-0000000012d3 30583 1726853723.95485: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30583 1726853723.95563: no more pending results, returning what we have 30583 1726853723.95568: results queue empty 30583 1726853723.95569: checking for any_errors_fatal 30583 1726853723.95579: done checking for any_errors_fatal 30583 1726853723.95580: checking for max_fail_percentage 30583 1726853723.95583: done checking for max_fail_percentage 30583 1726853723.95584: checking to see if all hosts have failed and the running result is not ok 30583 1726853723.95585: done checking to see if all hosts have failed 30583 1726853723.95585: getting the remaining hosts for this loop 30583 1726853723.95587: done getting the remaining hosts for this loop 30583 
1726853723.95592: getting the next task for host managed_node2 30583 1726853723.95603: done getting next task for host managed_node2 30583 1726853723.95607: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30583 1726853723.95612: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853723.95639: getting variables 30583 1726853723.95642: in VariableManager get_vars() 30583 1726853723.95794: Calling all_inventory to load vars for managed_node2 30583 1726853723.95797: Calling groups_inventory to load vars for managed_node2 30583 1726853723.95800: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853723.95811: Calling all_plugins_play to load vars for managed_node2 30583 1726853723.95815: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853723.95818: Calling groups_plugins_play to load vars for managed_node2 30583 1726853723.97652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853723.99444: done with get_vars() 30583 1726853723.99711: done getting variables 30583 1726853723.99890: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:35:23 -0400 (0:00:00.085) 0:00:59.336 ****** 30583 1726853723.99931: entering _queue_task() for managed_node2/set_fact 30583 1726853724.00699: worker is 1 (out of 1 available) 30583 1726853724.00713: exiting _queue_task() for managed_node2/set_fact 30583 1726853724.00727: done queuing things up, now waiting for results queue to drain 30583 1726853724.00728: waiting for pending results... 
30583 1726853724.01390: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30583 1726853724.01632: in run() - task 02083763-bbaf-05ea-abc5-0000000012d4 30583 1726853724.01637: variable 'ansible_search_path' from source: unknown 30583 1726853724.01639: variable 'ansible_search_path' from source: unknown 30583 1726853724.01763: calling self._execute() 30583 1726853724.01960: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853724.01963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853724.01967: variable 'omit' from source: magic vars 30583 1726853724.02826: variable 'ansible_distribution_major_version' from source: facts 30583 1726853724.02935: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853724.03496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853724.04180: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853724.04236: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853724.04269: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853724.04483: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853724.04600: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853724.04604: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853724.04710: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853724.04756: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853724.04969: variable '__network_is_ostree' from source: set_fact 30583 1726853724.04977: Evaluated conditional (not __network_is_ostree is defined): False 30583 1726853724.04980: when evaluation is False, skipping this task 30583 1726853724.04983: _execute() done 30583 1726853724.04985: dumping result to json 30583 1726853724.04988: done dumping result, returning 30583 1726853724.04999: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-05ea-abc5-0000000012d4] 30583 1726853724.05002: sending task result for task 02083763-bbaf-05ea-abc5-0000000012d4 30583 1726853724.05376: done sending task result for task 02083763-bbaf-05ea-abc5-0000000012d4 30583 1726853724.05380: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30583 1726853724.05436: no more pending results, returning what we have 30583 1726853724.05441: results queue empty 30583 1726853724.05442: checking for any_errors_fatal 30583 1726853724.05449: done checking for any_errors_fatal 30583 1726853724.05450: checking for max_fail_percentage 30583 1726853724.05453: done checking for max_fail_percentage 30583 1726853724.05454: checking to see if all hosts have failed and the running result is not ok 30583 1726853724.05455: done checking to see if all hosts have failed 30583 1726853724.05456: getting the remaining hosts for this loop 30583 1726853724.05460: done getting the remaining hosts for this loop 
30583 1726853724.05465: getting the next task for host managed_node2 30583 1726853724.05480: done getting next task for host managed_node2 30583 1726853724.05484: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853724.05494: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853724.05524: getting variables 30583 1726853724.05527: in VariableManager get_vars() 30583 1726853724.05700: Calling all_inventory to load vars for managed_node2 30583 1726853724.05704: Calling groups_inventory to load vars for managed_node2 30583 1726853724.05707: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853724.05716: Calling all_plugins_play to load vars for managed_node2 30583 1726853724.05719: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853724.05721: Calling groups_plugins_play to load vars for managed_node2 30583 1726853724.08866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853724.12269: done with get_vars() 30583 1726853724.12303: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:35:24 -0400 (0:00:00.126) 0:00:59.462 ****** 30583 1726853724.12537: entering _queue_task() for managed_node2/service_facts 30583 1726853724.13308: worker is 1 (out of 1 available) 30583 1726853724.13319: exiting _queue_task() for managed_node2/service_facts 30583 1726853724.13332: done queuing things up, now waiting for results queue to drain 30583 1726853724.13333: waiting for pending results... 
30583 1726853724.13869: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853724.14428: in run() - task 02083763-bbaf-05ea-abc5-0000000012d6 30583 1726853724.14432: variable 'ansible_search_path' from source: unknown 30583 1726853724.14435: variable 'ansible_search_path' from source: unknown 30583 1726853724.14438: calling self._execute() 30583 1726853724.14551: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853724.14593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853724.14631: variable 'omit' from source: magic vars 30583 1726853724.15482: variable 'ansible_distribution_major_version' from source: facts 30583 1726853724.15505: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853724.15519: variable 'omit' from source: magic vars 30583 1726853724.15819: variable 'omit' from source: magic vars 30583 1726853724.15823: variable 'omit' from source: magic vars 30583 1726853724.15893: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853724.15996: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853724.16028: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853724.16256: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853724.16262: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853724.16265: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853724.16267: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853724.16269: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853724.16340: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853724.16578: Set connection var ansible_timeout to 10 30583 1726853724.16582: Set connection var ansible_connection to ssh 30583 1726853724.16584: Set connection var ansible_shell_executable to /bin/sh 30583 1726853724.16586: Set connection var ansible_shell_type to sh 30583 1726853724.16589: Set connection var ansible_pipelining to False 30583 1726853724.16591: variable 'ansible_shell_executable' from source: unknown 30583 1726853724.16593: variable 'ansible_connection' from source: unknown 30583 1726853724.16595: variable 'ansible_module_compression' from source: unknown 30583 1726853724.16597: variable 'ansible_shell_type' from source: unknown 30583 1726853724.16599: variable 'ansible_shell_executable' from source: unknown 30583 1726853724.16601: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853724.16603: variable 'ansible_pipelining' from source: unknown 30583 1726853724.16605: variable 'ansible_timeout' from source: unknown 30583 1726853724.16606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853724.17056: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853724.17078: variable 'omit' from source: magic vars 30583 1726853724.17088: starting attempt loop 30583 1726853724.17094: running the handler 30583 1726853724.17116: _low_level_execute_command(): starting 30583 1726853724.17137: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853724.17895: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30583 1726853724.17992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853724.18024: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853724.18047: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853724.18067: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853724.18179: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853724.20015: stdout chunk (state=3): >>>/root <<< 30583 1726853724.20141: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853724.20216: stderr chunk (state=3): >>><<< 30583 1726853724.20221: stdout chunk (state=3): >>><<< 30583 1726853724.20263: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853724.20283: _low_level_execute_command(): starting 30583 1726853724.20361: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853724.2026398-33439-53812376290121 `" && echo ansible-tmp-1726853724.2026398-33439-53812376290121="` echo /root/.ansible/tmp/ansible-tmp-1726853724.2026398-33439-53812376290121 `" ) && sleep 0' 30583 1726853724.21011: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853724.21015: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853724.21025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853724.21041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853724.21058: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853724.21061: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853724.21075: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853724.21158: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853724.21163: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853724.21165: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853724.21167: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853724.21169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853724.21178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853724.21183: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853724.21185: stderr chunk (state=3): >>>debug2: match found <<< 30583 1726853724.21187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853724.21268: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853724.21273: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853724.21351: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853724.21512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853724.23567: stdout chunk (state=3): >>>ansible-tmp-1726853724.2026398-33439-53812376290121=/root/.ansible/tmp/ansible-tmp-1726853724.2026398-33439-53812376290121 <<< 30583 1726853724.23713: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853724.23718: stdout chunk (state=3): >>><<< 30583 1726853724.23729: stderr chunk (state=3): >>><<< 30583 1726853724.23831: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853724.2026398-33439-53812376290121=/root/.ansible/tmp/ansible-tmp-1726853724.2026398-33439-53812376290121 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853724.23984: variable 'ansible_module_compression' from source: unknown 30583 1726853724.23988: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30583 1726853724.24153: variable 'ansible_facts' from source: unknown 30583 1726853724.24362: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853724.2026398-33439-53812376290121/AnsiballZ_service_facts.py 30583 1726853724.24598: Sending initial data 30583 1726853724.24607: Sent initial data (161 bytes) 30583 1726853724.25187: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853724.25242: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853724.25312: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853724.25351: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853724.25497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853724.27318: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853724.27390: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853724.27452: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp8da289i2 /root/.ansible/tmp/ansible-tmp-1726853724.2026398-33439-53812376290121/AnsiballZ_service_facts.py <<< 30583 1726853724.27456: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853724.2026398-33439-53812376290121/AnsiballZ_service_facts.py" <<< 30583 1726853724.27541: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp8da289i2" to remote "/root/.ansible/tmp/ansible-tmp-1726853724.2026398-33439-53812376290121/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853724.2026398-33439-53812376290121/AnsiballZ_service_facts.py" <<< 30583 1726853724.29593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853724.29612: stderr chunk (state=3): >>><<< 30583 1726853724.29615: stdout chunk (state=3): >>><<< 30583 1726853724.29895: done transferring module to remote 30583 1726853724.29899: _low_level_execute_command(): starting 30583 1726853724.29901: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853724.2026398-33439-53812376290121/ /root/.ansible/tmp/ansible-tmp-1726853724.2026398-33439-53812376290121/AnsiballZ_service_facts.py && sleep 0' 30583 1726853724.31551: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853724.31574: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853724.31608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853724.31631: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853724.31796: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853724.31843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853724.31864: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853724.31882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853724.32052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853724.33919: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853724.33949: stderr chunk (state=3): >>><<< 30583 1726853724.33951: stdout chunk (state=3): >>><<< 30583 1726853724.33961: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853724.33981: _low_level_execute_command(): starting 30583 1726853724.33984: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853724.2026398-33439-53812376290121/AnsiballZ_service_facts.py && sleep 0' 30583 1726853724.34402: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853724.34405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853724.34409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853724.34411: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853724.34414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853724.34459: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853724.34462: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853724.34546: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853725.95183: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 30583 1726853725.95211: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": 
"systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 30583 1726853725.95229: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 30583 1726853725.95233: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "st<<< 30583 1726853725.95238: stdout chunk (state=3): >>>atic", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30583 1726853725.96836: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853725.96875: stderr chunk (state=3): >>><<< 30583 1726853725.96878: stdout chunk (state=3): >>><<< 30583 1726853725.96905: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": 
"stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": 
"disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": 
{"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": 
"systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853725.97633: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853724.2026398-33439-53812376290121/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853725.97637: _low_level_execute_command(): starting 30583 1726853725.97640: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853724.2026398-33439-53812376290121/ > /dev/null 2>&1 && sleep 0' 30583 1726853725.98120: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853725.98123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853725.98126: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853725.98128: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853725.98130: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853725.98176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853725.98194: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853725.98200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853725.98260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853726.00158: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853726.00185: stderr chunk (state=3): >>><<< 30583 1726853726.00188: stdout chunk (state=3): >>><<< 30583 1726853726.00200: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853726.00207: handler run complete 30583 1726853726.00329: variable 'ansible_facts' from source: unknown 30583 1726853726.00423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853726.00706: variable 'ansible_facts' from source: unknown 30583 1726853726.00790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853726.00905: attempt loop complete, returning result 30583 1726853726.00908: _execute() done 30583 1726853726.00911: dumping result to json 30583 1726853726.00946: done dumping result, returning 30583 1726853726.00955: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-05ea-abc5-0000000012d6] 30583 1726853726.00962: sending task result for task 02083763-bbaf-05ea-abc5-0000000012d6 30583 1726853726.01739: done sending task result for task 02083763-bbaf-05ea-abc5-0000000012d6 30583 1726853726.01742: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853726.01800: no more pending results, returning what we have 30583 1726853726.01803: results queue empty 30583 1726853726.01804: checking for any_errors_fatal 30583 1726853726.01806: done checking for any_errors_fatal 30583 1726853726.01807: checking for max_fail_percentage 30583 
1726853726.01808: done checking for max_fail_percentage 30583 1726853726.01809: checking to see if all hosts have failed and the running result is not ok 30583 1726853726.01809: done checking to see if all hosts have failed 30583 1726853726.01810: getting the remaining hosts for this loop 30583 1726853726.01811: done getting the remaining hosts for this loop 30583 1726853726.01813: getting the next task for host managed_node2 30583 1726853726.01817: done getting next task for host managed_node2 30583 1726853726.01819: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 1726853726.01824: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30583 1726853726.01831: getting variables 30583 1726853726.01832: in VariableManager get_vars() 30583 1726853726.01853: Calling all_inventory to load vars for managed_node2 30583 1726853726.01855: Calling groups_inventory to load vars for managed_node2 30583 1726853726.01857: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853726.01865: Calling all_plugins_play to load vars for managed_node2 30583 1726853726.01867: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853726.01874: Calling groups_plugins_play to load vars for managed_node2 30583 1726853726.02698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853726.04288: done with get_vars() 30583 1726853726.04315: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:35:26 -0400 (0:00:01.918) 0:01:01.381 ****** 30583 1726853726.04411: entering _queue_task() for managed_node2/package_facts 30583 1726853726.04777: worker is 1 (out of 1 available) 30583 1726853726.04790: exiting _queue_task() for managed_node2/package_facts 30583 1726853726.04804: done queuing things up, now waiting for results queue to drain 30583 1726853726.04805: waiting for pending results... 
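Aside (not part of the log): the `service_facts` payload dumped above is a flat mapping from unit name to a `{"name", "state", "status", "source"}` dict. A minimal Python sketch of filtering that structure, using a few entries copied verbatim from the dump; the helper name is hypothetical:

```python
# A few entries copied from the service_facts dump above; the schema is
# {"name", "state", "status", "source"} per unit.
services = {
    "firewalld.service": {"name": "firewalld.service", "state": "inactive",
                          "status": "disabled", "source": "systemd"},
    "getty@.service": {"name": "getty@.service", "state": "unknown",
                       "status": "enabled", "source": "systemd"},
    "polkit.service": {"name": "polkit.service", "state": "inactive",
                       "status": "static", "source": "systemd"},
}

def enabled_services(facts: dict) -> list[str]:
    """Return names of units whose unit-file status is 'enabled'.

    Hypothetical helper for post-processing service_facts output;
    not something the role above actually runs.
    """
    return sorted(name for name, svc in facts.items()
                  if svc["status"] == "enabled")

print(enabled_services(services))  # prints ['getty@.service']
```

This is how a role like `fedora.linux_system_roles.network` can decide, from the gathered facts alone, which services to act on without shelling out to `systemctl` again.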
30583 1726853726.05166: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 1726853726.05263: in run() - task 02083763-bbaf-05ea-abc5-0000000012d7 30583 1726853726.05280: variable 'ansible_search_path' from source: unknown 30583 1726853726.05288: variable 'ansible_search_path' from source: unknown 30583 1726853726.05326: calling self._execute() 30583 1726853726.05427: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853726.05433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853726.05445: variable 'omit' from source: magic vars 30583 1726853726.05845: variable 'ansible_distribution_major_version' from source: facts 30583 1726853726.05857: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853726.05867: variable 'omit' from source: magic vars 30583 1726853726.05945: variable 'omit' from source: magic vars 30583 1726853726.05981: variable 'omit' from source: magic vars 30583 1726853726.06024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853726.06060: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853726.06085: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853726.06103: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853726.06114: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853726.06159: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853726.06166: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853726.06170: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853726.06276: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853726.06283: Set connection var ansible_timeout to 10 30583 1726853726.06287: Set connection var ansible_connection to ssh 30583 1726853726.06291: Set connection var ansible_shell_executable to /bin/sh 30583 1726853726.06295: Set connection var ansible_shell_type to sh 30583 1726853726.06304: Set connection var ansible_pipelining to False 30583 1726853726.06329: variable 'ansible_shell_executable' from source: unknown 30583 1726853726.06332: variable 'ansible_connection' from source: unknown 30583 1726853726.06335: variable 'ansible_module_compression' from source: unknown 30583 1726853726.06337: variable 'ansible_shell_type' from source: unknown 30583 1726853726.06339: variable 'ansible_shell_executable' from source: unknown 30583 1726853726.06348: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853726.06350: variable 'ansible_pipelining' from source: unknown 30583 1726853726.06353: variable 'ansible_timeout' from source: unknown 30583 1726853726.06355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853726.06535: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853726.06543: variable 'omit' from source: magic vars 30583 1726853726.06548: starting attempt loop 30583 1726853726.06551: running the handler 30583 1726853726.06568: _low_level_execute_command(): starting 30583 1726853726.06577: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853726.07339: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853726.07356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30583 1726853726.07373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853726.07478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853726.07488: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853726.07500: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853726.07546: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853726.07626: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853726.09348: stdout chunk (state=3): >>>/root <<< 30583 1726853726.09492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853726.09515: stderr chunk (state=3): >>><<< 30583 1726853726.09526: stdout chunk (state=3): >>><<< 30583 1726853726.09676: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853726.09680: _low_level_execute_command(): starting 30583 1726853726.09683: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853726.095655-33515-51966491855989 `" && echo ansible-tmp-1726853726.095655-33515-51966491855989="` echo /root/.ansible/tmp/ansible-tmp-1726853726.095655-33515-51966491855989 `" ) && sleep 0' 30583 1726853726.10175: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853726.10215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853726.10266: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853726.10296: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853726.10399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853726.12438: stdout chunk (state=3): >>>ansible-tmp-1726853726.095655-33515-51966491855989=/root/.ansible/tmp/ansible-tmp-1726853726.095655-33515-51966491855989 <<< 30583 1726853726.12682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853726.12686: stdout chunk (state=3): >>><<< 30583 1726853726.12689: stderr chunk (state=3): >>><<< 30583 1726853726.12691: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853726.095655-33515-51966491855989=/root/.ansible/tmp/ansible-tmp-1726853726.095655-33515-51966491855989 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853726.12694: variable 'ansible_module_compression' from source: unknown 30583 1726853726.12739: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30583 1726853726.12819: variable 'ansible_facts' from source: unknown 30583 1726853726.13030: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853726.095655-33515-51966491855989/AnsiballZ_package_facts.py 30583 1726853726.13269: Sending initial data 30583 1726853726.13274: Sent initial data (160 bytes) 30583 1726853726.13855: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853726.13903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853726.13926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853726.13941: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853726.13985: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853726.14042: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853726.14052: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853726.14064: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853726.14250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853726.15966: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853726.16039: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853726.16112: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpneau6x0p /root/.ansible/tmp/ansible-tmp-1726853726.095655-33515-51966491855989/AnsiballZ_package_facts.py <<< 30583 1726853726.16116: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853726.095655-33515-51966491855989/AnsiballZ_package_facts.py" <<< 30583 1726853726.16212: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpneau6x0p" to remote "/root/.ansible/tmp/ansible-tmp-1726853726.095655-33515-51966491855989/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853726.095655-33515-51966491855989/AnsiballZ_package_facts.py" <<< 30583 1726853726.17855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853726.17899: stderr chunk (state=3): >>><<< 30583 1726853726.17976: stdout chunk (state=3): >>><<< 30583 1726853726.17980: done transferring module to remote 30583 1726853726.17983: _low_level_execute_command(): starting 30583 1726853726.17985: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853726.095655-33515-51966491855989/ /root/.ansible/tmp/ansible-tmp-1726853726.095655-33515-51966491855989/AnsiballZ_package_facts.py && sleep 0' 30583 1726853726.18767: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853726.18774: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853726.18818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853726.18894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853726.20989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853726.20996: stdout chunk (state=3): >>><<< 30583 1726853726.20999: stderr chunk (state=3): >>><<< 30583 1726853726.21002: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853726.21005: _low_level_execute_command(): starting 30583 1726853726.21008: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853726.095655-33515-51966491855989/AnsiballZ_package_facts.py && sleep 0' 30583 1726853726.21539: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853726.21556: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853726.21573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853726.21593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853726.21690: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853726.21705: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853726.21763: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853726.21895: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853726.67131: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": 
[{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": 
"6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", 
"version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": 
[{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": 
"2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 30583 1726853726.67154: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": 
"crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": 
[{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": 
"2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": 
[{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": 
[{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 30583 1726853726.67229: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", 
"version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": 
"rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": 
"2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": 
"4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", 
"release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30583 1726853726.69049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853726.69178: stderr chunk (state=3): >>><<< 30583 1726853726.69181: stdout chunk (state=3): >>><<< 30583 1726853726.69194: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": 
"8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": 
"6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": 
"libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": 
"6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": 
"libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": 
[{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": 
"kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", 
"release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": 
"perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", 
"source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": 
"rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853726.71582: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853726.095655-33515-51966491855989/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853726.71587: _low_level_execute_command(): starting 30583 1726853726.71589: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853726.095655-33515-51966491855989/ > /dev/null 2>&1 && sleep 0' 30583 1726853726.72208: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853726.72238: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853726.72343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853726.72367: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853726.72482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853726.74405: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853726.74478: stderr chunk (state=3): >>><<< 30583 1726853726.74482: stdout chunk (state=3): >>><<< 30583 1726853726.74499: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853726.74505: handler run complete 30583 1726853726.75347: variable 'ansible_facts' from source: unknown 30583 1726853726.75688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853726.76742: variable 'ansible_facts' from source: unknown 30583 1726853726.77077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853726.77785: attempt loop complete, returning result 30583 1726853726.77801: _execute() done 30583 1726853726.77809: dumping result to json 30583 1726853726.77977: done dumping result, returning 30583 1726853726.77985: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-05ea-abc5-0000000012d7] 30583 1726853726.77990: sending task result for task 02083763-bbaf-05ea-abc5-0000000012d7 30583 1726853726.79336: done sending task result for task 02083763-bbaf-05ea-abc5-0000000012d7 30583 1726853726.79339: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for 
this result", "changed": false } 30583 1726853726.79430: no more pending results, returning what we have 30583 1726853726.79432: results queue empty 30583 1726853726.79433: checking for any_errors_fatal 30583 1726853726.79437: done checking for any_errors_fatal 30583 1726853726.79438: checking for max_fail_percentage 30583 1726853726.79439: done checking for max_fail_percentage 30583 1726853726.79439: checking to see if all hosts have failed and the running result is not ok 30583 1726853726.79440: done checking to see if all hosts have failed 30583 1726853726.79440: getting the remaining hosts for this loop 30583 1726853726.79441: done getting the remaining hosts for this loop 30583 1726853726.79444: getting the next task for host managed_node2 30583 1726853726.79449: done getting next task for host managed_node2 30583 1726853726.79451: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853726.79454: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853726.79462: getting variables 30583 1726853726.79464: in VariableManager get_vars() 30583 1726853726.79488: Calling all_inventory to load vars for managed_node2 30583 1726853726.79490: Calling groups_inventory to load vars for managed_node2 30583 1726853726.79491: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853726.79497: Calling all_plugins_play to load vars for managed_node2 30583 1726853726.79499: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853726.79500: Calling groups_plugins_play to load vars for managed_node2 30583 1726853726.80518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853726.82293: done with get_vars() 30583 1726853726.82334: done getting variables 30583 1726853726.82401: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:35:26 -0400 (0:00:00.780) 0:01:02.161 ****** 30583 1726853726.82455: entering _queue_task() for managed_node2/debug 30583 1726853726.82854: worker is 1 (out of 1 available) 30583 1726853726.82983: exiting _queue_task() for managed_node2/debug 30583 1726853726.82996: done queuing things up, now waiting for results queue to drain 30583 1726853726.82998: waiting for pending results... 
30583 1726853726.83289: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853726.83434: in run() - task 02083763-bbaf-05ea-abc5-00000000127b 30583 1726853726.83460: variable 'ansible_search_path' from source: unknown 30583 1726853726.83468: variable 'ansible_search_path' from source: unknown 30583 1726853726.83526: calling self._execute() 30583 1726853726.83660: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853726.83664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853726.83666: variable 'omit' from source: magic vars 30583 1726853726.84099: variable 'ansible_distribution_major_version' from source: facts 30583 1726853726.84118: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853726.84179: variable 'omit' from source: magic vars 30583 1726853726.84208: variable 'omit' from source: magic vars 30583 1726853726.84320: variable 'network_provider' from source: set_fact 30583 1726853726.84342: variable 'omit' from source: magic vars 30583 1726853726.84393: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853726.84577: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853726.84580: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853726.84583: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853726.84586: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853726.84588: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853726.84590: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 
1726853726.84593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853726.84670: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853726.84685: Set connection var ansible_timeout to 10 30583 1726853726.84692: Set connection var ansible_connection to ssh 30583 1726853726.84712: Set connection var ansible_shell_executable to /bin/sh 30583 1726853726.84724: Set connection var ansible_shell_type to sh 30583 1726853726.84738: Set connection var ansible_pipelining to False 30583 1726853726.84767: variable 'ansible_shell_executable' from source: unknown 30583 1726853726.84825: variable 'ansible_connection' from source: unknown 30583 1726853726.84828: variable 'ansible_module_compression' from source: unknown 30583 1726853726.84830: variable 'ansible_shell_type' from source: unknown 30583 1726853726.84832: variable 'ansible_shell_executable' from source: unknown 30583 1726853726.84834: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853726.84836: variable 'ansible_pipelining' from source: unknown 30583 1726853726.84838: variable 'ansible_timeout' from source: unknown 30583 1726853726.84840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853726.84982: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853726.85000: variable 'omit' from source: magic vars 30583 1726853726.85043: starting attempt loop 30583 1726853726.85047: running the handler 30583 1726853726.85083: handler run complete 30583 1726853726.85105: attempt loop complete, returning result 30583 1726853726.85112: _execute() done 30583 1726853726.85151: dumping result to json 30583 1726853726.85155: done dumping result, returning 
30583 1726853726.85158: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-05ea-abc5-00000000127b] 30583 1726853726.85160: sending task result for task 02083763-bbaf-05ea-abc5-00000000127b 30583 1726853726.85372: done sending task result for task 02083763-bbaf-05ea-abc5-00000000127b 30583 1726853726.85377: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 30583 1726853726.85457: no more pending results, returning what we have 30583 1726853726.85461: results queue empty 30583 1726853726.85462: checking for any_errors_fatal 30583 1726853726.85590: done checking for any_errors_fatal 30583 1726853726.85596: checking for max_fail_percentage 30583 1726853726.85598: done checking for max_fail_percentage 30583 1726853726.85600: checking to see if all hosts have failed and the running result is not ok 30583 1726853726.85601: done checking to see if all hosts have failed 30583 1726853726.85602: getting the remaining hosts for this loop 30583 1726853726.85604: done getting the remaining hosts for this loop 30583 1726853726.85608: getting the next task for host managed_node2 30583 1726853726.85617: done getting next task for host managed_node2 30583 1726853726.85621: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853726.85627: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853726.85640: getting variables 30583 1726853726.85641: in VariableManager get_vars() 30583 1726853726.85790: Calling all_inventory to load vars for managed_node2 30583 1726853726.85793: Calling groups_inventory to load vars for managed_node2 30583 1726853726.85795: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853726.85805: Calling all_plugins_play to load vars for managed_node2 30583 1726853726.85807: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853726.85810: Calling groups_plugins_play to load vars for managed_node2 30583 1726853726.87402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853726.90307: done with get_vars() 30583 1726853726.90386: done getting variables 30583 1726853726.90444: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:35:26 -0400 (0:00:00.081) 0:01:02.243 ****** 30583 1726853726.90640: entering _queue_task() for managed_node2/fail 30583 1726853726.91318: worker is 1 (out of 1 available) 30583 1726853726.91483: exiting _queue_task() for managed_node2/fail 30583 1726853726.91495: done queuing things up, now waiting for results queue to drain 30583 1726853726.91497: waiting for pending results... 30583 1726853726.92108: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853726.92210: in run() - task 02083763-bbaf-05ea-abc5-00000000127c 30583 1726853726.92235: variable 'ansible_search_path' from source: unknown 30583 1726853726.92244: variable 'ansible_search_path' from source: unknown 30583 1726853726.92286: calling self._execute() 30583 1726853726.92396: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853726.92421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853726.92429: variable 'omit' from source: magic vars 30583 1726853726.92858: variable 'ansible_distribution_major_version' from source: facts 30583 1726853726.92862: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853726.93002: variable 'network_state' from source: role '' defaults 30583 1726853726.93019: Evaluated conditional (network_state != {}): False 30583 1726853726.93077: when evaluation is False, skipping this task 30583 1726853726.93080: _execute() done 30583 1726853726.93083: dumping result to json 30583 1726853726.93085: done dumping result, returning 30583 1726853726.93093: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-05ea-abc5-00000000127c] 30583 1726853726.93097: sending task result for task 02083763-bbaf-05ea-abc5-00000000127c skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853726.93334: no more pending results, returning what we have 30583 1726853726.93339: results queue empty 30583 1726853726.93340: checking for any_errors_fatal 30583 1726853726.93348: done checking for any_errors_fatal 30583 1726853726.93349: checking for max_fail_percentage 30583 1726853726.93351: done checking for max_fail_percentage 30583 1726853726.93352: checking to see if all hosts have failed and the running result is not ok 30583 1726853726.93353: done checking to see if all hosts have failed 30583 1726853726.93353: getting the remaining hosts for this loop 30583 1726853726.93355: done getting the remaining hosts for this loop 30583 1726853726.93359: getting the next task for host managed_node2 30583 1726853726.93368: done getting next task for host managed_node2 30583 1726853726.93421: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30583 1726853726.93427: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853726.93442: done sending task result for task 02083763-bbaf-05ea-abc5-00000000127c 30583 1726853726.93445: WORKER PROCESS EXITING 30583 1726853726.93468: getting variables 30583 1726853726.93470: in VariableManager get_vars() 30583 1726853726.93517: Calling all_inventory to load vars for managed_node2 30583 1726853726.93522: Calling groups_inventory to load vars for managed_node2 30583 1726853726.93641: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853726.93651: Calling all_plugins_play to load vars for managed_node2 30583 1726853726.93653: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853726.93656: Calling groups_plugins_play to load vars for managed_node2 30583 1726853726.95269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853726.97140: done with get_vars() 30583 1726853726.97178: done getting variables 30583 1726853726.97319: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:35:26 -0400 (0:00:00.067) 0:01:02.310 ****** 30583 1726853726.97361: entering _queue_task() for managed_node2/fail 30583 1726853726.98302: worker is 1 (out of 1 available) 30583 1726853726.98315: exiting _queue_task() for managed_node2/fail 30583 1726853726.98330: done queuing things up, now waiting for results queue to drain 30583 1726853726.98447: waiting for pending results... 30583 1726853726.98965: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30583 1726853726.99329: in run() - task 02083763-bbaf-05ea-abc5-00000000127d 30583 1726853726.99336: variable 'ansible_search_path' from source: unknown 30583 1726853726.99340: variable 'ansible_search_path' from source: unknown 30583 1726853726.99377: calling self._execute() 30583 1726853726.99682: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853726.99688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853726.99699: variable 'omit' from source: magic vars 30583 1726853727.00638: variable 'ansible_distribution_major_version' from source: facts 30583 1726853727.00642: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853727.00645: variable 'network_state' from source: role '' defaults 30583 1726853727.00648: Evaluated conditional (network_state != {}): False 30583 1726853727.00651: when evaluation is False, skipping this task 30583 1726853727.00653: _execute() done 30583 1726853727.00655: dumping result to json 30583 1726853727.00657: done dumping result, returning 30583 1726853727.00660: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the 
system version of the managed host is below 8 [02083763-bbaf-05ea-abc5-00000000127d] 30583 1726853727.00663: sending task result for task 02083763-bbaf-05ea-abc5-00000000127d skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853727.01087: no more pending results, returning what we have 30583 1726853727.01092: results queue empty 30583 1726853727.01093: checking for any_errors_fatal 30583 1726853727.01101: done checking for any_errors_fatal 30583 1726853727.01102: checking for max_fail_percentage 30583 1726853727.01104: done checking for max_fail_percentage 30583 1726853727.01105: checking to see if all hosts have failed and the running result is not ok 30583 1726853727.01105: done checking to see if all hosts have failed 30583 1726853727.01106: getting the remaining hosts for this loop 30583 1726853727.01108: done getting the remaining hosts for this loop 30583 1726853727.01112: getting the next task for host managed_node2 30583 1726853727.01120: done getting next task for host managed_node2 30583 1726853727.01125: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30583 1726853727.01130: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853727.01161: getting variables 30583 1726853727.01163: in VariableManager get_vars() 30583 1726853727.01351: Calling all_inventory to load vars for managed_node2 30583 1726853727.01354: Calling groups_inventory to load vars for managed_node2 30583 1726853727.01356: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853727.01364: Calling all_plugins_play to load vars for managed_node2 30583 1726853727.01367: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853727.01369: Calling groups_plugins_play to load vars for managed_node2 30583 1726853727.02066: done sending task result for task 02083763-bbaf-05ea-abc5-00000000127d 30583 1726853727.02069: WORKER PROCESS EXITING 30583 1726853727.05114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853727.08505: done with get_vars() 30583 1726853727.08657: done getting variables 30583 1726853727.08719: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 
or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:35:27 -0400 (0:00:00.115) 0:01:02.425 ****** 30583 1726853727.08870: entering _queue_task() for managed_node2/fail 30583 1726853727.09454: worker is 1 (out of 1 available) 30583 1726853727.09468: exiting _queue_task() for managed_node2/fail 30583 1726853727.09484: done queuing things up, now waiting for results queue to drain 30583 1726853727.09485: waiting for pending results... 30583 1726853727.09832: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30583 1726853727.10063: in run() - task 02083763-bbaf-05ea-abc5-00000000127e 30583 1726853727.10068: variable 'ansible_search_path' from source: unknown 30583 1726853727.10073: variable 'ansible_search_path' from source: unknown 30583 1726853727.10113: calling self._execute() 30583 1726853727.10239: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853727.10279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853727.10283: variable 'omit' from source: magic vars 30583 1726853727.10693: variable 'ansible_distribution_major_version' from source: facts 30583 1726853727.10719: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853727.10934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853727.13996: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853727.14000: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853727.14114: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 
1726853727.14153: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853727.14240: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853727.14444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853727.14481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853727.14511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.14570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853727.14592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853727.14708: variable 'ansible_distribution_major_version' from source: facts 30583 1726853727.14756: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30583 1726853727.14884: variable 'ansible_distribution' from source: facts 30583 1726853727.14892: variable '__network_rh_distros' from source: role '' defaults 30583 1726853727.14909: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30583 1726853727.15276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853727.15279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853727.15282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.15286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853727.15318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853727.15368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853727.15411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853727.15442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.15487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853727.15624: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853727.15627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853727.15630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853727.15632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.15666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853727.15688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853727.16043: variable 'network_connections' from source: include params 30583 1726853727.16179: variable 'interface' from source: play vars 30583 1726853727.16183: variable 'interface' from source: play vars 30583 1726853727.16185: variable 'network_state' from source: role '' defaults 30583 1726853727.16239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853727.16439: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853727.16483: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 
1726853727.16532: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853727.16612: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853727.16644: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853727.16670: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853727.16713: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.16774: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853727.16803: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30583 1726853727.16829: when evaluation is False, skipping this task 30583 1726853727.16831: _execute() done 30583 1726853727.16834: dumping result to json 30583 1726853727.16844: done dumping result, returning 30583 1726853727.16938: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-05ea-abc5-00000000127e] 30583 1726853727.16942: sending task result for task 02083763-bbaf-05ea-abc5-00000000127e 30583 1726853727.17177: done sending task 
result for task 02083763-bbaf-05ea-abc5-00000000127e 30583 1726853727.17181: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30583 1726853727.17251: no more pending results, returning what we have 30583 1726853727.17256: results queue empty 30583 1726853727.17257: checking for any_errors_fatal 30583 1726853727.17264: done checking for any_errors_fatal 30583 1726853727.17265: checking for max_fail_percentage 30583 1726853727.17267: done checking for max_fail_percentage 30583 1726853727.17268: checking to see if all hosts have failed and the running result is not ok 30583 1726853727.17269: done checking to see if all hosts have failed 30583 1726853727.17269: getting the remaining hosts for this loop 30583 1726853727.17273: done getting the remaining hosts for this loop 30583 1726853727.17278: getting the next task for host managed_node2 30583 1726853727.17287: done getting next task for host managed_node2 30583 1726853727.17292: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30583 1726853727.17297: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853727.17323: getting variables 30583 1726853727.17325: in VariableManager get_vars() 30583 1726853727.17368: Calling all_inventory to load vars for managed_node2 30583 1726853727.17453: Calling groups_inventory to load vars for managed_node2 30583 1726853727.17457: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853727.17467: Calling all_plugins_play to load vars for managed_node2 30583 1726853727.17470: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853727.17476: Calling groups_plugins_play to load vars for managed_node2 30583 1726853727.19024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853727.20797: done with get_vars() 30583 1726853727.20819: done getting variables 30583 1726853727.20885: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:35:27 -0400 (0:00:00.120) 0:01:02.546 ****** 30583 1726853727.20919: entering _queue_task() for managed_node2/dnf 30583 1726853727.21391: worker is 1 (out of 1 available) 30583 1726853727.21402: exiting _queue_task() for managed_node2/dnf 30583 1726853727.21413: done queuing things up, now waiting for results queue to drain 30583 1726853727.21414: waiting for pending results... 30583 1726853727.21693: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30583 1726853727.21842: in run() - task 02083763-bbaf-05ea-abc5-00000000127f 30583 1726853727.21864: variable 'ansible_search_path' from source: unknown 30583 1726853727.21876: variable 'ansible_search_path' from source: unknown 30583 1726853727.21929: calling self._execute() 30583 1726853727.22045: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853727.22057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853727.22078: variable 'omit' from source: magic vars 30583 1726853727.22504: variable 'ansible_distribution_major_version' from source: facts 30583 1726853727.22732: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853727.23020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853727.24849: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853727.24927: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853727.24965: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853727.24994: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853727.25018: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853727.25093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853727.25119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853727.25380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.25384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853727.25387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853727.25390: variable 'ansible_distribution' from source: facts 30583 1726853727.25392: variable 'ansible_distribution_major_version' from source: facts 30583 1726853727.25398: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30583 1726853727.25443: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853727.25569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853727.25594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853727.25626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.25651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853727.25664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853727.25705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853727.25742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853727.25761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.25848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853727.25854: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853727.25857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853727.25862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853727.25886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.25931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853727.25943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853727.26100: variable 'network_connections' from source: include params 30583 1726853727.26106: variable 'interface' from source: play vars 30583 1726853727.26173: variable 'interface' from source: play vars 30583 1726853727.26264: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853727.26497: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853727.26502: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853727.26504: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853727.26507: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853727.26642: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853727.26647: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853727.26663: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.26666: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853727.26668: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853727.26914: variable 'network_connections' from source: include params 30583 1726853727.26917: variable 'interface' from source: play vars 30583 1726853727.26956: variable 'interface' from source: play vars 30583 1726853727.26977: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853727.26980: when evaluation is False, skipping this task 30583 1726853727.26983: _execute() done 30583 1726853727.26985: dumping result to json 30583 1726853727.26987: done dumping result, returning 30583 1726853727.27003: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-00000000127f] 30583 
1726853727.27017: sending task result for task 02083763-bbaf-05ea-abc5-00000000127f 30583 1726853727.27127: done sending task result for task 02083763-bbaf-05ea-abc5-00000000127f 30583 1726853727.27130: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853727.27215: no more pending results, returning what we have 30583 1726853727.27218: results queue empty 30583 1726853727.27219: checking for any_errors_fatal 30583 1726853727.27227: done checking for any_errors_fatal 30583 1726853727.27228: checking for max_fail_percentage 30583 1726853727.27229: done checking for max_fail_percentage 30583 1726853727.27230: checking to see if all hosts have failed and the running result is not ok 30583 1726853727.27231: done checking to see if all hosts have failed 30583 1726853727.27231: getting the remaining hosts for this loop 30583 1726853727.27233: done getting the remaining hosts for this loop 30583 1726853727.27237: getting the next task for host managed_node2 30583 1726853727.27245: done getting next task for host managed_node2 30583 1726853727.27249: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30583 1726853727.27255: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853727.27393: getting variables 30583 1726853727.27395: in VariableManager get_vars() 30583 1726853727.27435: Calling all_inventory to load vars for managed_node2 30583 1726853727.27438: Calling groups_inventory to load vars for managed_node2 30583 1726853727.27440: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853727.27452: Calling all_plugins_play to load vars for managed_node2 30583 1726853727.27455: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853727.27459: Calling groups_plugins_play to load vars for managed_node2 30583 1726853727.29259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853727.30153: done with get_vars() 30583 1726853727.30174: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30583 1726853727.30231: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:35:27 -0400 (0:00:00.093) 0:01:02.639 ****** 30583 1726853727.30257: entering _queue_task() for managed_node2/yum 30583 1726853727.30645: worker is 1 (out of 1 available) 30583 1726853727.30659: exiting _queue_task() for managed_node2/yum 30583 1726853727.30877: done queuing things up, now waiting for results queue to drain 30583 1726853727.30879: waiting for pending results... 30583 1726853727.31093: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30583 1726853727.31215: in run() - task 02083763-bbaf-05ea-abc5-000000001280 30583 1726853727.31219: variable 'ansible_search_path' from source: unknown 30583 1726853727.31223: variable 'ansible_search_path' from source: unknown 30583 1726853727.31249: calling self._execute() 30583 1726853727.31363: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853727.31367: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853727.31374: variable 'omit' from source: magic vars 30583 1726853727.31666: variable 'ansible_distribution_major_version' from source: facts 30583 1726853727.31681: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853727.31843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853727.33623: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853727.33681: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853727.33708: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853727.33732: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853727.33756: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853727.33815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853727.33836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853727.33855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.33887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853727.33898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853727.33964: variable 'ansible_distribution_major_version' from source: facts 30583 1726853727.33981: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30583 1726853727.33984: when evaluation is False, skipping this task 30583 1726853727.33987: _execute() done 30583 1726853727.33989: dumping result to json 30583 1726853727.33991: done dumping result, returning 30583 1726853727.33999: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000001280] 30583 1726853727.34002: sending task result for task 02083763-bbaf-05ea-abc5-000000001280 30583 1726853727.34097: done sending task result for task 02083763-bbaf-05ea-abc5-000000001280 30583 1726853727.34102: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30583 1726853727.34155: no more pending results, returning what we have 30583 1726853727.34161: results queue empty 30583 1726853727.34162: checking for any_errors_fatal 30583 1726853727.34170: done checking for any_errors_fatal 30583 1726853727.34170: checking for max_fail_percentage 30583 1726853727.34174: done checking for max_fail_percentage 30583 1726853727.34175: checking to see if all hosts have failed and the running result is not ok 30583 1726853727.34176: done checking to see if all hosts have failed 30583 1726853727.34176: getting the remaining hosts for this loop 30583 1726853727.34178: done getting the remaining hosts for this loop 30583 1726853727.34182: getting the next task for host managed_node2 30583 1726853727.34191: done getting next task for host managed_node2 30583 1726853727.34195: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30583 1726853727.34200: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853727.34231: getting variables 30583 1726853727.34233: in VariableManager get_vars() 30583 1726853727.34273: Calling all_inventory to load vars for managed_node2 30583 1726853727.34276: Calling groups_inventory to load vars for managed_node2 30583 1726853727.34278: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853727.34287: Calling all_plugins_play to load vars for managed_node2 30583 1726853727.34289: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853727.34292: Calling groups_plugins_play to load vars for managed_node2 30583 1726853727.35232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853727.40289: done with get_vars() 30583 1726853727.40321: done getting variables 30583 1726853727.40378: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:35:27 -0400 (0:00:00.101) 0:01:02.741 ****** 30583 1726853727.40411: entering _queue_task() for managed_node2/fail 30583 1726853727.40756: worker is 1 (out of 1 available) 30583 1726853727.40770: exiting _queue_task() for managed_node2/fail 30583 1726853727.40785: done queuing things up, now waiting for results queue to drain 30583 1726853727.40787: waiting for pending results... 30583 1726853727.41296: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30583 1726853727.41302: in run() - task 02083763-bbaf-05ea-abc5-000000001281 30583 1726853727.41308: variable 'ansible_search_path' from source: unknown 30583 1726853727.41312: variable 'ansible_search_path' from source: unknown 30583 1726853727.41317: calling self._execute() 30583 1726853727.41455: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853727.41462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853727.41465: variable 'omit' from source: magic vars 30583 1726853727.41868: variable 'ansible_distribution_major_version' from source: facts 30583 1726853727.41875: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853727.42001: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853727.42220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853727.44444: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853727.44503: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853727.44534: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853727.44563: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853727.44584: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853727.44645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853727.44666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853727.44689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.44713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853727.44724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853727.44762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853727.44778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853727.44796: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.44820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853727.44830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853727.44863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853727.44882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853727.44897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.44923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853727.44932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853727.45053: variable 'network_connections' from source: include params 30583 1726853727.45067: variable 'interface' from source: play vars 30583 1726853727.45118: variable 'interface' from source: play vars 30583 1726853727.45171: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853727.45290: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853727.45315: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853727.45339: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853727.45362: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853727.45398: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853727.45410: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853727.45427: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.45445: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853727.45487: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853727.45644: variable 'network_connections' from source: include params 30583 1726853727.45648: variable 'interface' from source: play vars 30583 1726853727.45695: variable 'interface' from source: play vars 30583 1726853727.45715: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853727.45719: when evaluation is False, skipping this task 30583 
1726853727.45721: _execute() done 30583 1726853727.45724: dumping result to json 30583 1726853727.45726: done dumping result, returning 30583 1726853727.45734: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000001281] 30583 1726853727.45737: sending task result for task 02083763-bbaf-05ea-abc5-000000001281 30583 1726853727.45830: done sending task result for task 02083763-bbaf-05ea-abc5-000000001281 30583 1726853727.45833: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853727.45887: no more pending results, returning what we have 30583 1726853727.45891: results queue empty 30583 1726853727.45892: checking for any_errors_fatal 30583 1726853727.45902: done checking for any_errors_fatal 30583 1726853727.45903: checking for max_fail_percentage 30583 1726853727.45905: done checking for max_fail_percentage 30583 1726853727.45906: checking to see if all hosts have failed and the running result is not ok 30583 1726853727.45906: done checking to see if all hosts have failed 30583 1726853727.45907: getting the remaining hosts for this loop 30583 1726853727.45909: done getting the remaining hosts for this loop 30583 1726853727.45912: getting the next task for host managed_node2 30583 1726853727.45920: done getting next task for host managed_node2 30583 1726853727.45924: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30583 1726853727.45928: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853727.45952: getting variables 30583 1726853727.45954: in VariableManager get_vars() 30583 1726853727.45998: Calling all_inventory to load vars for managed_node2 30583 1726853727.46001: Calling groups_inventory to load vars for managed_node2 30583 1726853727.46003: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853727.46011: Calling all_plugins_play to load vars for managed_node2 30583 1726853727.46014: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853727.46016: Calling groups_plugins_play to load vars for managed_node2 30583 1726853727.46825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853727.47712: done with get_vars() 30583 1726853727.47730: done getting variables 30583 1726853727.47775: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:35:27 -0400 (0:00:00.073) 0:01:02.815 ****** 30583 1726853727.47802: entering _queue_task() for managed_node2/package 30583 1726853727.48051: worker is 1 (out of 1 available) 30583 1726853727.48065: exiting _queue_task() for managed_node2/package 30583 1726853727.48080: done queuing things up, now waiting for results queue to drain 30583 1726853727.48082: waiting for pending results... 30583 1726853727.48277: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 30583 1726853727.48383: in run() - task 02083763-bbaf-05ea-abc5-000000001282 30583 1726853727.48394: variable 'ansible_search_path' from source: unknown 30583 1726853727.48397: variable 'ansible_search_path' from source: unknown 30583 1726853727.48430: calling self._execute() 30583 1726853727.48510: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853727.48516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853727.48528: variable 'omit' from source: magic vars 30583 1726853727.48813: variable 'ansible_distribution_major_version' from source: facts 30583 1726853727.48822: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853727.48960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853727.49158: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853727.49196: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853727.49222: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853727.49279: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853727.49365: variable 'network_packages' from source: role '' defaults 30583 1726853727.49441: variable '__network_provider_setup' from source: role '' defaults 30583 1726853727.49450: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853727.49497: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853727.49504: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853727.49549: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853727.49666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853727.51294: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853727.51333: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853727.51361: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853727.51389: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853727.51409: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853727.51466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853727.51491: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853727.51509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.51534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853727.51545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853727.51584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853727.51600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853727.51616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.51642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853727.51651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 
1726853727.51795: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853727.51885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853727.51901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853727.51921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.51945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853727.51955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853727.52025: variable 'ansible_python' from source: facts 30583 1726853727.52033: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853727.52090: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853727.52145: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853727.52228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853727.52250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853727.52269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.52295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853727.52305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853727.52336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853727.52360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853727.52378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.52401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853727.52412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853727.52507: variable 'network_connections' from source: include params 
30583 1726853727.52512: variable 'interface' from source: play vars 30583 1726853727.52585: variable 'interface' from source: play vars 30583 1726853727.52633: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853727.52652: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853727.52678: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.52702: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853727.52738: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853727.52919: variable 'network_connections' from source: include params 30583 1726853727.52923: variable 'interface' from source: play vars 30583 1726853727.52994: variable 'interface' from source: play vars 30583 1726853727.53019: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853727.53075: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853727.53272: variable 'network_connections' from source: include params 30583 1726853727.53275: variable 'interface' from source: play vars 30583 1726853727.53318: variable 'interface' from source: play vars 30583 1726853727.53338: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853727.53392: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853727.53591: variable 'network_connections' 
from source: include params 30583 1726853727.53594: variable 'interface' from source: play vars 30583 1726853727.53638: variable 'interface' from source: play vars 30583 1726853727.53682: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853727.53723: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853727.53729: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853727.53775: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853727.53911: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853727.54215: variable 'network_connections' from source: include params 30583 1726853727.54218: variable 'interface' from source: play vars 30583 1726853727.54259: variable 'interface' from source: play vars 30583 1726853727.54267: variable 'ansible_distribution' from source: facts 30583 1726853727.54270: variable '__network_rh_distros' from source: role '' defaults 30583 1726853727.54278: variable 'ansible_distribution_major_version' from source: facts 30583 1726853727.54289: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853727.54396: variable 'ansible_distribution' from source: facts 30583 1726853727.54400: variable '__network_rh_distros' from source: role '' defaults 30583 1726853727.54402: variable 'ansible_distribution_major_version' from source: facts 30583 1726853727.54420: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853727.54524: variable 'ansible_distribution' from source: facts 30583 1726853727.54527: variable '__network_rh_distros' from source: role '' defaults 30583 1726853727.54529: variable 'ansible_distribution_major_version' from source: facts 30583 1726853727.54558: variable 'network_provider' from source: set_fact 30583 
1726853727.54573: variable 'ansible_facts' from source: unknown 30583 1726853727.54941: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30583 1726853727.54945: when evaluation is False, skipping this task 30583 1726853727.54949: _execute() done 30583 1726853727.54951: dumping result to json 30583 1726853727.54953: done dumping result, returning 30583 1726853727.54959: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-05ea-abc5-000000001282] 30583 1726853727.54975: sending task result for task 02083763-bbaf-05ea-abc5-000000001282 30583 1726853727.55057: done sending task result for task 02083763-bbaf-05ea-abc5-000000001282 30583 1726853727.55060: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30583 1726853727.55124: no more pending results, returning what we have 30583 1726853727.55127: results queue empty 30583 1726853727.55128: checking for any_errors_fatal 30583 1726853727.55133: done checking for any_errors_fatal 30583 1726853727.55134: checking for max_fail_percentage 30583 1726853727.55136: done checking for max_fail_percentage 30583 1726853727.55137: checking to see if all hosts have failed and the running result is not ok 30583 1726853727.55137: done checking to see if all hosts have failed 30583 1726853727.55138: getting the remaining hosts for this loop 30583 1726853727.55140: done getting the remaining hosts for this loop 30583 1726853727.55143: getting the next task for host managed_node2 30583 1726853727.55151: done getting next task for host managed_node2 30583 1726853727.55155: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30583 1726853727.55160: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853727.55186: getting variables 30583 1726853727.55189: in VariableManager get_vars() 30583 1726853727.55234: Calling all_inventory to load vars for managed_node2 30583 1726853727.55236: Calling groups_inventory to load vars for managed_node2 30583 1726853727.55238: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853727.55247: Calling all_plugins_play to load vars for managed_node2 30583 1726853727.55249: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853727.55252: Calling groups_plugins_play to load vars for managed_node2 30583 1726853727.56222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853727.57119: done with get_vars() 30583 1726853727.57137: done getting variables 30583 1726853727.57203: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:35:27 -0400 (0:00:00.094) 0:01:02.909 ****** 30583 1726853727.57231: entering _queue_task() for managed_node2/package 30583 1726853727.57489: worker is 1 (out of 1 available) 30583 1726853727.57505: exiting _queue_task() for managed_node2/package 30583 1726853727.57518: done queuing things up, now waiting for results queue to drain 30583 1726853727.57520: waiting for pending results... 
30583 1726853727.57714: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30583 1726853727.57820: in run() - task 02083763-bbaf-05ea-abc5-000000001283 30583 1726853727.57830: variable 'ansible_search_path' from source: unknown 30583 1726853727.57834: variable 'ansible_search_path' from source: unknown 30583 1726853727.57870: calling self._execute() 30583 1726853727.57948: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853727.57951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853727.57967: variable 'omit' from source: magic vars 30583 1726853727.58249: variable 'ansible_distribution_major_version' from source: facts 30583 1726853727.58259: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853727.58346: variable 'network_state' from source: role '' defaults 30583 1726853727.58354: Evaluated conditional (network_state != {}): False 30583 1726853727.58357: when evaluation is False, skipping this task 30583 1726853727.58360: _execute() done 30583 1726853727.58365: dumping result to json 30583 1726853727.58368: done dumping result, returning 30583 1726853727.58377: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-05ea-abc5-000000001283] 30583 1726853727.58381: sending task result for task 02083763-bbaf-05ea-abc5-000000001283 30583 1726853727.58475: done sending task result for task 02083763-bbaf-05ea-abc5-000000001283 30583 1726853727.58478: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853727.58546: no more pending results, returning what we have 30583 1726853727.58550: results queue empty 30583 1726853727.58551: checking 
for any_errors_fatal 30583 1726853727.58557: done checking for any_errors_fatal 30583 1726853727.58558: checking for max_fail_percentage 30583 1726853727.58559: done checking for max_fail_percentage 30583 1726853727.58560: checking to see if all hosts have failed and the running result is not ok 30583 1726853727.58561: done checking to see if all hosts have failed 30583 1726853727.58562: getting the remaining hosts for this loop 30583 1726853727.58564: done getting the remaining hosts for this loop 30583 1726853727.58568: getting the next task for host managed_node2 30583 1726853727.58577: done getting next task for host managed_node2 30583 1726853727.58580: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30583 1726853727.58585: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853727.58607: getting variables 30583 1726853727.58609: in VariableManager get_vars() 30583 1726853727.58641: Calling all_inventory to load vars for managed_node2 30583 1726853727.58643: Calling groups_inventory to load vars for managed_node2 30583 1726853727.58645: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853727.58652: Calling all_plugins_play to load vars for managed_node2 30583 1726853727.58655: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853727.58657: Calling groups_plugins_play to load vars for managed_node2 30583 1726853727.59467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853727.60820: done with get_vars() 30583 1726853727.60838: done getting variables 30583 1726853727.60890: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:35:27 -0400 (0:00:00.036) 0:01:02.946 ****** 30583 1726853727.60916: entering _queue_task() for managed_node2/package 30583 1726853727.61164: worker is 1 (out of 1 available) 30583 1726853727.61180: exiting _queue_task() for managed_node2/package 30583 1726853727.61193: done queuing things up, now waiting for results queue to drain 30583 1726853727.61195: waiting for pending results... 
30583 1726853727.61385: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30583 1726853727.61486: in run() - task 02083763-bbaf-05ea-abc5-000000001284 30583 1726853727.61498: variable 'ansible_search_path' from source: unknown 30583 1726853727.61503: variable 'ansible_search_path' from source: unknown 30583 1726853727.61535: calling self._execute() 30583 1726853727.61612: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853727.61615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853727.61623: variable 'omit' from source: magic vars 30583 1726853727.61906: variable 'ansible_distribution_major_version' from source: facts 30583 1726853727.61916: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853727.62004: variable 'network_state' from source: role '' defaults 30583 1726853727.62011: Evaluated conditional (network_state != {}): False 30583 1726853727.62015: when evaluation is False, skipping this task 30583 1726853727.62017: _execute() done 30583 1726853727.62020: dumping result to json 30583 1726853727.62022: done dumping result, returning 30583 1726853727.62029: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-05ea-abc5-000000001284] 30583 1726853727.62034: sending task result for task 02083763-bbaf-05ea-abc5-000000001284 30583 1726853727.62125: done sending task result for task 02083763-bbaf-05ea-abc5-000000001284 30583 1726853727.62128: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853727.62181: no more pending results, returning what we have 30583 1726853727.62185: results queue empty 30583 1726853727.62186: checking for 
any_errors_fatal 30583 1726853727.62194: done checking for any_errors_fatal 30583 1726853727.62195: checking for max_fail_percentage 30583 1726853727.62197: done checking for max_fail_percentage 30583 1726853727.62198: checking to see if all hosts have failed and the running result is not ok 30583 1726853727.62198: done checking to see if all hosts have failed 30583 1726853727.62199: getting the remaining hosts for this loop 30583 1726853727.62201: done getting the remaining hosts for this loop 30583 1726853727.62204: getting the next task for host managed_node2 30583 1726853727.62212: done getting next task for host managed_node2 30583 1726853727.62216: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30583 1726853727.62220: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853727.62244: getting variables 30583 1726853727.62245: in VariableManager get_vars() 30583 1726853727.62288: Calling all_inventory to load vars for managed_node2 30583 1726853727.62291: Calling groups_inventory to load vars for managed_node2 30583 1726853727.62293: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853727.62301: Calling all_plugins_play to load vars for managed_node2 30583 1726853727.62303: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853727.62306: Calling groups_plugins_play to load vars for managed_node2 30583 1726853727.63212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853727.64700: done with get_vars() 30583 1726853727.64724: done getting variables 30583 1726853727.64786: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:35:27 -0400 (0:00:00.039) 0:01:02.985 ****** 30583 1726853727.64827: entering _queue_task() for managed_node2/service 30583 1726853727.65181: worker is 1 (out of 1 available) 30583 1726853727.65196: exiting _queue_task() for managed_node2/service 30583 1726853727.65209: done queuing things up, now waiting for results queue to drain 30583 1726853727.65210: waiting for pending results... 
30583 1726853727.65596: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30583 1726853727.65718: in run() - task 02083763-bbaf-05ea-abc5-000000001285 30583 1726853727.65740: variable 'ansible_search_path' from source: unknown 30583 1726853727.65749: variable 'ansible_search_path' from source: unknown 30583 1726853727.65800: calling self._execute() 30583 1726853727.65976: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853727.65980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853727.65984: variable 'omit' from source: magic vars 30583 1726853727.66352: variable 'ansible_distribution_major_version' from source: facts 30583 1726853727.66375: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853727.66491: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853727.66692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853727.68990: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853727.69091: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853727.69135: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853727.69286: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853727.69289: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853727.69303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30583 1726853727.69338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853727.69375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.69426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853727.69448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853727.69511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853727.69539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853727.69574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.69621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853727.69641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853727.69689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853727.69721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853727.69750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.69797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853727.69817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853727.70011: variable 'network_connections' from source: include params 30583 1726853727.70029: variable 'interface' from source: play vars 30583 1726853727.70154: variable 'interface' from source: play vars 30583 1726853727.70195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853727.70384: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853727.70437: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853727.70482: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853727.70517: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853727.70568: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853727.70604: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853727.70678: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.70681: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853727.70728: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853727.70988: variable 'network_connections' from source: include params 30583 1726853727.70998: variable 'interface' from source: play vars 30583 1726853727.71069: variable 'interface' from source: play vars 30583 1726853727.71100: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853727.71108: when evaluation is False, skipping this task 30583 1726853727.71114: _execute() done 30583 1726853727.71133: dumping result to json 30583 1726853727.71136: done dumping result, returning 30583 1726853727.71141: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000001285] 30583 1726853727.71176: sending task result for task 02083763-bbaf-05ea-abc5-000000001285 30583 1726853727.71418: done sending task result for task 
02083763-bbaf-05ea-abc5-000000001285 30583 1726853727.71503: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853727.71514: no more pending results, returning what we have 30583 1726853727.71520: results queue empty 30583 1726853727.71522: checking for any_errors_fatal 30583 1726853727.71528: done checking for any_errors_fatal 30583 1726853727.71528: checking for max_fail_percentage 30583 1726853727.71530: done checking for max_fail_percentage 30583 1726853727.71531: checking to see if all hosts have failed and the running result is not ok 30583 1726853727.71532: done checking to see if all hosts have failed 30583 1726853727.71532: getting the remaining hosts for this loop 30583 1726853727.71534: done getting the remaining hosts for this loop 30583 1726853727.71538: getting the next task for host managed_node2 30583 1726853727.71545: done getting next task for host managed_node2 30583 1726853727.71549: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30583 1726853727.71553: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853727.71574: getting variables 30583 1726853727.71576: in VariableManager get_vars() 30583 1726853727.71618: Calling all_inventory to load vars for managed_node2 30583 1726853727.71621: Calling groups_inventory to load vars for managed_node2 30583 1726853727.71627: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853727.71635: Calling all_plugins_play to load vars for managed_node2 30583 1726853727.71637: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853727.71640: Calling groups_plugins_play to load vars for managed_node2 30583 1726853727.72440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853727.73445: done with get_vars() 30583 1726853727.73463: done getting variables 30583 1726853727.73507: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:35:27 -0400 (0:00:00.087) 0:01:03.072 ****** 30583 1726853727.73530: entering _queue_task() for managed_node2/service 30583 1726853727.73787: worker is 1 (out of 1 available) 30583 1726853727.73802: exiting _queue_task() for managed_node2/service 30583 1726853727.73817: done 
queuing things up, now waiting for results queue to drain 30583 1726853727.73818: waiting for pending results... 30583 1726853727.74033: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30583 1726853727.74306: in run() - task 02083763-bbaf-05ea-abc5-000000001286 30583 1726853727.74311: variable 'ansible_search_path' from source: unknown 30583 1726853727.74313: variable 'ansible_search_path' from source: unknown 30583 1726853727.74316: calling self._execute() 30583 1726853727.74329: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853727.74336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853727.74346: variable 'omit' from source: magic vars 30583 1726853727.74712: variable 'ansible_distribution_major_version' from source: facts 30583 1726853727.74724: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853727.74873: variable 'network_provider' from source: set_fact 30583 1726853727.74879: variable 'network_state' from source: role '' defaults 30583 1726853727.74943: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30583 1726853727.74947: variable 'omit' from source: magic vars 30583 1726853727.74950: variable 'omit' from source: magic vars 30583 1726853727.75015: variable 'network_service_name' from source: role '' defaults 30583 1726853727.75041: variable 'network_service_name' from source: role '' defaults 30583 1726853727.75273: variable '__network_provider_setup' from source: role '' defaults 30583 1726853727.75282: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853727.75286: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853727.75289: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853727.75447: variable '__network_packages_default_nm' from source: role '' 
defaults 30583 1726853727.75623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853727.77122: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853727.77291: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853727.77294: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853727.77303: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853727.77332: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853727.77421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853727.77455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853727.77488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.77544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853727.77561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853727.77617: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853727.77648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853727.77677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.77724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853727.77746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853727.77997: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853727.78128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853727.78400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853727.78403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.78405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853727.78407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853727.78749: variable 'ansible_python' from source: facts 30583 1726853727.78752: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853727.78787: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853727.79025: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853727.79263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853727.79307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853727.79337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.79398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853727.79422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853727.79482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853727.79530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853727.79563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.79627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853727.79688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853727.79816: variable 'network_connections' from source: include params 30583 1726853727.79839: variable 'interface' from source: play vars 30583 1726853727.79925: variable 'interface' from source: play vars 30583 1726853727.80056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853727.80294: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853727.80383: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853727.80415: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853727.80467: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853727.80539: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853727.80713: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853727.80716: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853727.80718: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853727.80763: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853727.81100: variable 'network_connections' from source: include params 30583 1726853727.81111: variable 'interface' from source: play vars 30583 1726853727.81208: variable 'interface' from source: play vars 30583 1726853727.81262: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853727.81345: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853727.81607: variable 'network_connections' from source: include params 30583 1726853727.81610: variable 'interface' from source: play vars 30583 1726853727.81661: variable 'interface' from source: play vars 30583 1726853727.81678: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853727.81734: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853727.81923: variable 'network_connections' from source: include params 30583 1726853727.81927: variable 'interface' from source: play vars 30583 1726853727.81976: variable 'interface' from source: play vars 30583 1726853727.82016: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30583 1726853727.82056: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853727.82062: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853727.82104: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853727.82243: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853727.82565: variable 'network_connections' from source: include params 30583 1726853727.82568: variable 'interface' from source: play vars 30583 1726853727.82607: variable 'interface' from source: play vars 30583 1726853727.82613: variable 'ansible_distribution' from source: facts 30583 1726853727.82616: variable '__network_rh_distros' from source: role '' defaults 30583 1726853727.82623: variable 'ansible_distribution_major_version' from source: facts 30583 1726853727.82634: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853727.82751: variable 'ansible_distribution' from source: facts 30583 1726853727.82754: variable '__network_rh_distros' from source: role '' defaults 30583 1726853727.82758: variable 'ansible_distribution_major_version' from source: facts 30583 1726853727.82774: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853727.82887: variable 'ansible_distribution' from source: facts 30583 1726853727.82891: variable '__network_rh_distros' from source: role '' defaults 30583 1726853727.82895: variable 'ansible_distribution_major_version' from source: facts 30583 1726853727.82921: variable 'network_provider' from source: set_fact 30583 1726853727.82938: variable 'omit' from source: magic vars 30583 1726853727.82960: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853727.82985: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853727.83038: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853727.83041: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853727.83044: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853727.83065: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853727.83068: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853727.83074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853727.83160: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853727.83169: Set connection var ansible_timeout to 10 30583 1726853727.83174: Set connection var ansible_connection to ssh 30583 1726853727.83178: Set connection var ansible_shell_executable to /bin/sh 30583 1726853727.83182: Set connection var ansible_shell_type to sh 30583 1726853727.83224: Set connection var ansible_pipelining to False 30583 1726853727.83228: variable 'ansible_shell_executable' from source: unknown 30583 1726853727.83230: variable 'ansible_connection' from source: unknown 30583 1726853727.83233: variable 'ansible_module_compression' from source: unknown 30583 1726853727.83235: variable 'ansible_shell_type' from source: unknown 30583 1726853727.83237: variable 'ansible_shell_executable' from source: unknown 30583 1726853727.83239: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853727.83241: variable 'ansible_pipelining' from source: unknown 30583 1726853727.83243: variable 'ansible_timeout' from source: unknown 30583 1726853727.83309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 
1726853727.83433: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853727.83442: variable 'omit' from source: magic vars 30583 1726853727.83444: starting attempt loop 30583 1726853727.83446: running the handler 30583 1726853727.83704: variable 'ansible_facts' from source: unknown 30583 1726853727.84229: _low_level_execute_command(): starting 30583 1726853727.84232: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853727.84900: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853727.84919: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853727.84922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853727.84935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853727.84947: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853727.84955: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853727.84966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853727.84983: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853727.84991: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853727.84997: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853727.85005: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853727.85014: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30583 1726853727.85026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853727.85040: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853727.85043: stderr chunk (state=3): >>>debug2: match found <<< 30583 1726853727.85049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853727.85168: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853727.85260: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853727.85264: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853727.85472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853727.87192: stdout chunk (state=3): >>>/root <<< 30583 1726853727.87380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853727.87384: stdout chunk (state=3): >>><<< 30583 1726853727.87386: stderr chunk (state=3): >>><<< 30583 1726853727.87390: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853727.87392: _low_level_execute_command(): starting 30583 1726853727.87396: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853727.8735404-33585-202896591692462 `" && echo ansible-tmp-1726853727.8735404-33585-202896591692462="` echo /root/.ansible/tmp/ansible-tmp-1726853727.8735404-33585-202896591692462 `" ) && sleep 0' 30583 1726853727.87958: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853727.88051: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853727.88055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853727.88058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853727.88060: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853727.88062: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853727.88064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853727.88066: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853727.88068: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853727.88072: stderr chunk (state=3): >>>debug1: re-parsing 
configuration <<< 30583 1726853727.88074: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853727.88077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853727.88079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853727.88081: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853727.88083: stderr chunk (state=3): >>>debug2: match found <<< 30583 1726853727.88085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853727.88165: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853727.88168: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853727.88197: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853727.88297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853727.90291: stdout chunk (state=3): >>>ansible-tmp-1726853727.8735404-33585-202896591692462=/root/.ansible/tmp/ansible-tmp-1726853727.8735404-33585-202896591692462 <<< 30583 1726853727.90437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853727.90447: stdout chunk (state=3): >>><<< 30583 1726853727.90475: stderr chunk (state=3): >>><<< 30583 1726853727.90497: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853727.8735404-33585-202896591692462=/root/.ansible/tmp/ansible-tmp-1726853727.8735404-33585-202896591692462 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853727.90540: variable 'ansible_module_compression' from source: unknown 30583 1726853727.90609: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30583 1726853727.90677: variable 'ansible_facts' from source: unknown 30583 1726853727.90990: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853727.8735404-33585-202896591692462/AnsiballZ_systemd.py 30583 1726853727.91151: Sending initial data 30583 1726853727.91154: Sent initial data (156 bytes) 30583 1726853727.91778: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853727.91821: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853727.91841: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853727.91878: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853727.91997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853727.93800: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853727.93874: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853727.93947: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpl6y8nkxv /root/.ansible/tmp/ansible-tmp-1726853727.8735404-33585-202896591692462/AnsiballZ_systemd.py <<< 30583 1726853727.93950: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853727.8735404-33585-202896591692462/AnsiballZ_systemd.py" <<< 30583 1726853727.94001: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpl6y8nkxv" to remote "/root/.ansible/tmp/ansible-tmp-1726853727.8735404-33585-202896591692462/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853727.8735404-33585-202896591692462/AnsiballZ_systemd.py" <<< 30583 1726853727.95667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853727.95852: stderr chunk (state=3): >>><<< 30583 1726853727.95855: stdout chunk (state=3): >>><<< 30583 1726853727.95860: done transferring module to remote 30583 1726853727.95862: _low_level_execute_command(): starting 30583 1726853727.95865: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853727.8735404-33585-202896591692462/ /root/.ansible/tmp/ansible-tmp-1726853727.8735404-33585-202896591692462/AnsiballZ_systemd.py && sleep 0' 30583 1726853727.96438: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853727.96442: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853727.96466: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853727.96569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853727.98682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853727.98687: stdout chunk (state=3): >>><<< 30583 1726853727.98690: stderr chunk (state=3): >>><<< 30583 1726853727.98874: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853727.98878: _low_level_execute_command(): starting 30583 1726853727.98881: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853727.8735404-33585-202896591692462/AnsiballZ_systemd.py && sleep 0' 30583 1726853727.99411: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853727.99425: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853727.99440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853727.99457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853727.99480: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853727.99586: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853727.99604: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853727.99618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
30583 1726853727.99726: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853728.29738: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4653056", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3300118528", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1878026000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", 
"ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30583 1726853728.31710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853728.31793: stderr chunk (state=3): >>>Shared connection to 10.31.9.197 closed. <<< 30583 1726853728.31796: stdout chunk (state=3): >>><<< 30583 1726853728.31798: stderr chunk (state=3): >>><<< 30583 1726853728.31978: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": 
"0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4653056", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3300118528", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1878026000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not 
set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": 
"infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", 
"InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853728.32027: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853727.8735404-33585-202896591692462/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853728.32053: _low_level_execute_command(): starting 30583 1726853728.32066: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853727.8735404-33585-202896591692462/ > /dev/null 2>&1 && sleep 0' 30583 1726853728.32885: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853728.32907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853728.33012: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853728.34942: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853728.34982: stdout chunk (state=3): >>><<< 30583 1726853728.34994: stderr chunk (state=3): >>><<< 30583 1726853728.35022: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853728.35116: handler run complete 30583 1726853728.35279: attempt loop complete, returning result 30583 1726853728.35282: _execute() done 30583 1726853728.35285: dumping result to json 30583 1726853728.35303: done dumping result, returning 30583 1726853728.35316: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-05ea-abc5-000000001286] 30583 1726853728.35362: sending task result for task 02083763-bbaf-05ea-abc5-000000001286 30583 1726853728.36263: done sending task result for task 02083763-bbaf-05ea-abc5-000000001286 30583 1726853728.36267: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853728.36330: no more pending results, returning what we have 30583 1726853728.36334: results queue empty 30583 1726853728.36335: checking for any_errors_fatal 30583 1726853728.36341: done checking for any_errors_fatal 30583 1726853728.36342: checking for max_fail_percentage 30583 1726853728.36345: done checking for max_fail_percentage 30583 1726853728.36346: checking to see if all hosts have failed and the running result is not ok 30583 1726853728.36347: done checking to see if all hosts have failed 30583 1726853728.36348: getting the remaining 
hosts for this loop 30583 1726853728.36350: done getting the remaining hosts for this loop 30583 1726853728.36353: getting the next task for host managed_node2 30583 1726853728.36370: done getting next task for host managed_node2 30583 1726853728.36376: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30583 1726853728.36381: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853728.36395: getting variables 30583 1726853728.36397: in VariableManager get_vars() 30583 1726853728.36434: Calling all_inventory to load vars for managed_node2 30583 1726853728.36437: Calling groups_inventory to load vars for managed_node2 30583 1726853728.36440: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853728.36451: Calling all_plugins_play to load vars for managed_node2 30583 1726853728.36454: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853728.36461: Calling groups_plugins_play to load vars for managed_node2 30583 1726853728.39796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853728.43369: done with get_vars() 30583 1726853728.43437: done getting variables 30583 1726853728.43523: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:35:28 -0400 (0:00:00.700) 0:01:03.772 ****** 30583 1726853728.43579: entering _queue_task() for managed_node2/service 30583 1726853728.44024: worker is 1 (out of 1 available) 30583 1726853728.44039: exiting _queue_task() for managed_node2/service 30583 1726853728.44166: done queuing things up, now waiting for results queue to drain 30583 1726853728.44168: waiting for pending results... 
30583 1726853728.44407: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30583 1726853728.44652: in run() - task 02083763-bbaf-05ea-abc5-000000001287 30583 1726853728.44657: variable 'ansible_search_path' from source: unknown 30583 1726853728.44662: variable 'ansible_search_path' from source: unknown 30583 1726853728.44686: calling self._execute() 30583 1726853728.44981: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853728.44986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853728.44990: variable 'omit' from source: magic vars 30583 1726853728.45880: variable 'ansible_distribution_major_version' from source: facts 30583 1726853728.45979: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853728.46244: variable 'network_provider' from source: set_fact 30583 1726853728.46248: Evaluated conditional (network_provider == "nm"): True 30583 1726853728.46470: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853728.46700: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853728.47013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853728.52617: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853728.52747: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853728.52978: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853728.52981: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853728.53025: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853728.53269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853728.53276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853728.53352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853728.53404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853728.53472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853728.53598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853728.53674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853728.53711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853728.53820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853728.53840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853728.53955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853728.53992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853728.54022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853728.54077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853728.54095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853728.54363: variable 'network_connections' from source: include params 30583 1726853728.54366: variable 'interface' from source: play vars 30583 1726853728.54469: variable 'interface' from source: play vars 30583 1726853728.54579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853728.54782: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853728.54829: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853728.54874: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853728.54912: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853728.54960: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853728.54994: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853728.55030: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853728.55064: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853728.55125: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853728.55412: variable 'network_connections' from source: include params 30583 1726853728.55415: variable 'interface' from source: play vars 30583 1726853728.55481: variable 'interface' from source: play vars 30583 1726853728.55556: Evaluated conditional (__network_wpa_supplicant_required): False 30583 1726853728.55561: when evaluation is False, skipping this task 30583 1726853728.55564: _execute() done 30583 1726853728.55566: dumping result to json 30583 1726853728.55568: done dumping result, returning 30583 1726853728.55570: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-05ea-abc5-000000001287] 30583 
1726853728.55584: sending task result for task 02083763-bbaf-05ea-abc5-000000001287 30583 1726853728.55783: done sending task result for task 02083763-bbaf-05ea-abc5-000000001287 30583 1726853728.55786: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30583 1726853728.55838: no more pending results, returning what we have 30583 1726853728.55901: results queue empty 30583 1726853728.55902: checking for any_errors_fatal 30583 1726853728.55927: done checking for any_errors_fatal 30583 1726853728.55928: checking for max_fail_percentage 30583 1726853728.55931: done checking for max_fail_percentage 30583 1726853728.55932: checking to see if all hosts have failed and the running result is not ok 30583 1726853728.55932: done checking to see if all hosts have failed 30583 1726853728.55933: getting the remaining hosts for this loop 30583 1726853728.55935: done getting the remaining hosts for this loop 30583 1726853728.55939: getting the next task for host managed_node2 30583 1726853728.55947: done getting next task for host managed_node2 30583 1726853728.56182: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30583 1726853728.56187: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853728.56209: getting variables 30583 1726853728.56211: in VariableManager get_vars() 30583 1726853728.56246: Calling all_inventory to load vars for managed_node2 30583 1726853728.56249: Calling groups_inventory to load vars for managed_node2 30583 1726853728.56251: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853728.56262: Calling all_plugins_play to load vars for managed_node2 30583 1726853728.56265: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853728.56268: Calling groups_plugins_play to load vars for managed_node2 30583 1726853728.58111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853728.59814: done with get_vars() 30583 1726853728.59840: done getting variables 30583 1726853728.59910: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:35:28 -0400 (0:00:00.163) 0:01:03.936 
****** 30583 1726853728.59941: entering _queue_task() for managed_node2/service 30583 1726853728.60494: worker is 1 (out of 1 available) 30583 1726853728.60505: exiting _queue_task() for managed_node2/service 30583 1726853728.60517: done queuing things up, now waiting for results queue to drain 30583 1726853728.60518: waiting for pending results... 30583 1726853728.60762: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30583 1726853728.60840: in run() - task 02083763-bbaf-05ea-abc5-000000001288 30583 1726853728.60875: variable 'ansible_search_path' from source: unknown 30583 1726853728.60885: variable 'ansible_search_path' from source: unknown 30583 1726853728.60927: calling self._execute() 30583 1726853728.61045: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853728.61078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853728.61084: variable 'omit' from source: magic vars 30583 1726853728.61488: variable 'ansible_distribution_major_version' from source: facts 30583 1726853728.61576: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853728.61769: variable 'network_provider' from source: set_fact 30583 1726853728.61783: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853728.62183: when evaluation is False, skipping this task 30583 1726853728.62186: _execute() done 30583 1726853728.62189: dumping result to json 30583 1726853728.62192: done dumping result, returning 30583 1726853728.62195: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-05ea-abc5-000000001288] 30583 1726853728.62197: sending task result for task 02083763-bbaf-05ea-abc5-000000001288 30583 1726853728.62274: done sending task result for task 02083763-bbaf-05ea-abc5-000000001288 30583 1726853728.62277: WORKER PROCESS EXITING skipping: 
[managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853728.62331: no more pending results, returning what we have 30583 1726853728.62335: results queue empty 30583 1726853728.62337: checking for any_errors_fatal 30583 1726853728.62348: done checking for any_errors_fatal 30583 1726853728.62349: checking for max_fail_percentage 30583 1726853728.62352: done checking for max_fail_percentage 30583 1726853728.62353: checking to see if all hosts have failed and the running result is not ok 30583 1726853728.62354: done checking to see if all hosts have failed 30583 1726853728.62354: getting the remaining hosts for this loop 30583 1726853728.62356: done getting the remaining hosts for this loop 30583 1726853728.62363: getting the next task for host managed_node2 30583 1726853728.62375: done getting next task for host managed_node2 30583 1726853728.62380: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853728.62386: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853728.62422: getting variables 30583 1726853728.62425: in VariableManager get_vars() 30583 1726853728.62787: Calling all_inventory to load vars for managed_node2 30583 1726853728.62791: Calling groups_inventory to load vars for managed_node2 30583 1726853728.62794: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853728.62804: Calling all_plugins_play to load vars for managed_node2 30583 1726853728.62807: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853728.62810: Calling groups_plugins_play to load vars for managed_node2 30583 1726853728.65606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853728.67386: done with get_vars() 30583 1726853728.67415: done getting variables 30583 1726853728.67486: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:35:28 -0400 (0:00:00.075) 0:01:04.012 ****** 30583 1726853728.67522: entering _queue_task() for managed_node2/copy 30583 1726853728.68074: worker is 1 (out of 1 available) 30583 1726853728.68086: exiting _queue_task() for managed_node2/copy 30583 1726853728.68098: done queuing things up, now waiting for results queue to drain 30583 1726853728.68101: waiting for 
pending results... 30583 1726853728.68796: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853728.68941: in run() - task 02083763-bbaf-05ea-abc5-000000001289 30583 1726853728.68956: variable 'ansible_search_path' from source: unknown 30583 1726853728.68962: variable 'ansible_search_path' from source: unknown 30583 1726853728.69109: calling self._execute() 30583 1726853728.69251: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853728.69254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853728.69266: variable 'omit' from source: magic vars 30583 1726853728.70165: variable 'ansible_distribution_major_version' from source: facts 30583 1726853728.70186: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853728.70297: variable 'network_provider' from source: set_fact 30583 1726853728.70304: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853728.70307: when evaluation is False, skipping this task 30583 1726853728.70309: _execute() done 30583 1726853728.70313: dumping result to json 30583 1726853728.70315: done dumping result, returning 30583 1726853728.70325: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-05ea-abc5-000000001289] 30583 1726853728.70329: sending task result for task 02083763-bbaf-05ea-abc5-000000001289 skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30583 1726853728.70593: no more pending results, returning what we have 30583 1726853728.70596: results queue empty 30583 1726853728.70597: checking for any_errors_fatal 30583 1726853728.70602: done checking for any_errors_fatal 30583 1726853728.70603: checking for 
max_fail_percentage 30583 1726853728.70604: done checking for max_fail_percentage 30583 1726853728.70605: checking to see if all hosts have failed and the running result is not ok 30583 1726853728.70606: done checking to see if all hosts have failed 30583 1726853728.70607: getting the remaining hosts for this loop 30583 1726853728.70608: done getting the remaining hosts for this loop 30583 1726853728.70611: getting the next task for host managed_node2 30583 1726853728.70618: done getting next task for host managed_node2 30583 1726853728.70622: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853728.70626: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853728.70646: getting variables 30583 1726853728.70648: in VariableManager get_vars() 30583 1726853728.70691: Calling all_inventory to load vars for managed_node2 30583 1726853728.70693: Calling groups_inventory to load vars for managed_node2 30583 1726853728.70695: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853728.70704: Calling all_plugins_play to load vars for managed_node2 30583 1726853728.70706: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853728.70722: done sending task result for task 02083763-bbaf-05ea-abc5-000000001289 30583 1726853728.70725: WORKER PROCESS EXITING 30583 1726853728.70730: Calling groups_plugins_play to load vars for managed_node2 30583 1726853728.72210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853728.73915: done with get_vars() 30583 1726853728.73944: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:35:28 -0400 (0:00:00.065) 0:01:04.077 ****** 30583 1726853728.74039: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853728.74591: worker is 1 (out of 1 available) 30583 1726853728.74601: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853728.74612: done queuing things up, now waiting for results queue to drain 30583 1726853728.74613: waiting for pending results... 
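The three tasks above ("Enable network service", "Ensure initscripts network file dependency is present") all skip for the same reason: `network_provider` was set to `nm` by an earlier `set_fact`, so the role's `initscripts`-only guard evaluates False and no module is dispatched. A minimal, hypothetical model of that controller-side short-circuit — not Ansible's actual `TaskExecutor` code, just the shape of the `skipping: [managed_node2]` results printed above:

```python
# Hedged sketch: when a task's `when:` expression is False, Ansible emits a
# skip result instead of running the module. The dict below mirrors the
# "skipping" JSON seen in the log; run_task() is an illustrative stand-in.
def run_task(when_condition: bool, condition_text: str) -> dict:
    if not when_condition:
        return {
            "changed": False,
            "false_condition": condition_text,
            "skip_reason": "Conditional result was False",
        }
    return {"changed": True}  # placeholder for a real module result

network_provider = "nm"  # from set_fact, as shown in the log
result = run_task(network_provider == "initscripts",
                  'network_provider == "initscripts"')
print(result["skip_reason"])  # Conditional result was False
```

This is why each skipped task still produces a full result cycle in the log (queue, dump to JSON, send result) — the skip is itself a task result, not an absence of one.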
30583 1726853728.74752: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853728.74928: in run() - task 02083763-bbaf-05ea-abc5-00000000128a 30583 1726853728.74955: variable 'ansible_search_path' from source: unknown 30583 1726853728.74965: variable 'ansible_search_path' from source: unknown 30583 1726853728.75007: calling self._execute() 30583 1726853728.75115: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853728.75127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853728.75142: variable 'omit' from source: magic vars 30583 1726853728.75602: variable 'ansible_distribution_major_version' from source: facts 30583 1726853728.75606: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853728.75607: variable 'omit' from source: magic vars 30583 1726853728.75635: variable 'omit' from source: magic vars 30583 1726853728.75784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853728.78038: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853728.78116: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853728.78160: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853728.78199: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853728.78237: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853728.78330: variable 'network_provider' from source: set_fact 30583 1726853728.78541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853728.78544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853728.78547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853728.78581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853728.78600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853728.78687: variable 'omit' from source: magic vars 30583 1726853728.78804: variable 'omit' from source: magic vars 30583 1726853728.78921: variable 'network_connections' from source: include params 30583 1726853728.78937: variable 'interface' from source: play vars 30583 1726853728.79009: variable 'interface' from source: play vars 30583 1726853728.79168: variable 'omit' from source: magic vars 30583 1726853728.79183: variable '__lsr_ansible_managed' from source: task vars 30583 1726853728.79249: variable '__lsr_ansible_managed' from source: task vars 30583 1726853728.79455: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30583 1726853728.79875: Loaded config def from plugin (lookup/template) 30583 1726853728.79879: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30583 1726853728.79881: File lookup term: get_ansible_managed.j2 30583 1726853728.79883: variable 
'ansible_search_path' from source: unknown 30583 1726853728.79886: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30583 1726853728.79890: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30583 1726853728.79893: variable 'ansible_search_path' from source: unknown 30583 1726853728.86144: variable 'ansible_managed' from source: unknown 30583 1726853728.86285: variable 'omit' from source: magic vars 30583 1726853728.86322: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853728.86350: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853728.86378: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853728.86397: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30583 1726853728.86416: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853728.86452: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853728.86464: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853728.86474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853728.86615: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853728.86636: Set connection var ansible_timeout to 10 30583 1726853728.86644: Set connection var ansible_connection to ssh 30583 1726853728.86654: Set connection var ansible_shell_executable to /bin/sh 30583 1726853728.86711: Set connection var ansible_shell_type to sh 30583 1726853728.86714: Set connection var ansible_pipelining to False 30583 1726853728.86716: variable 'ansible_shell_executable' from source: unknown 30583 1726853728.86718: variable 'ansible_connection' from source: unknown 30583 1726853728.86724: variable 'ansible_module_compression' from source: unknown 30583 1726853728.86733: variable 'ansible_shell_type' from source: unknown 30583 1726853728.86745: variable 'ansible_shell_executable' from source: unknown 30583 1726853728.86820: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853728.86823: variable 'ansible_pipelining' from source: unknown 30583 1726853728.86825: variable 'ansible_timeout' from source: unknown 30583 1726853728.86827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853728.86934: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853728.86972: variable 'omit' from 
source: magic vars 30583 1726853728.86984: starting attempt loop 30583 1726853728.86991: running the handler 30583 1726853728.87011: _low_level_execute_command(): starting 30583 1726853728.87023: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853728.87928: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853728.87947: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853728.87980: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853728.88008: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853728.88108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853728.89888: stdout chunk (state=3): >>>/root <<< 30583 1726853728.90069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853728.90074: stdout chunk (state=3): >>><<< 30583 1726853728.90077: stderr chunk (state=3): >>><<< 30583 1726853728.90222: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853728.90226: _low_level_execute_command(): starting 30583 1726853728.90229: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853728.9010952-33626-26667901254065 `" && echo ansible-tmp-1726853728.9010952-33626-26667901254065="` echo /root/.ansible/tmp/ansible-tmp-1726853728.9010952-33626-26667901254065 `" ) && sleep 0' 30583 1726853728.90952: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853728.91037: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853728.91066: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853728.91097: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853728.91204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853728.93256: stdout chunk (state=3): >>>ansible-tmp-1726853728.9010952-33626-26667901254065=/root/.ansible/tmp/ansible-tmp-1726853728.9010952-33626-26667901254065 <<< 30583 1726853728.93418: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853728.93422: stdout chunk (state=3): >>><<< 30583 1726853728.93424: stderr chunk (state=3): >>><<< 30583 1726853728.93577: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853728.9010952-33626-26667901254065=/root/.ansible/tmp/ansible-tmp-1726853728.9010952-33626-26667901254065 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853728.93581: variable 'ansible_module_compression' from source: unknown 30583 1726853728.93584: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30583 1726853728.93597: variable 'ansible_facts' from source: unknown 30583 1726853728.93724: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853728.9010952-33626-26667901254065/AnsiballZ_network_connections.py 30583 1726853728.93940: Sending initial data 30583 1726853728.93943: Sent initial data (167 bytes) 30583 1726853728.94594: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853728.94634: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853728.94703: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853728.94774: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853728.94799: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853728.94835: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853728.94946: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853728.96622: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853728.96693: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853728.96765: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp1ppw2bq6 /root/.ansible/tmp/ansible-tmp-1726853728.9010952-33626-26667901254065/AnsiballZ_network_connections.py <<< 30583 1726853728.96768: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853728.9010952-33626-26667901254065/AnsiballZ_network_connections.py" <<< 30583 1726853728.96830: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp1ppw2bq6" to remote "/root/.ansible/tmp/ansible-tmp-1726853728.9010952-33626-26667901254065/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853728.9010952-33626-26667901254065/AnsiballZ_network_connections.py" <<< 30583 1726853728.98184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853728.98282: stderr chunk (state=3): >>><<< 30583 1726853728.98285: stdout chunk (state=3): >>><<< 30583 1726853728.98287: done transferring module to remote 30583 1726853728.98289: _low_level_execute_command(): starting 30583 1726853728.98292: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853728.9010952-33626-26667901254065/ /root/.ansible/tmp/ansible-tmp-1726853728.9010952-33626-26667901254065/AnsiballZ_network_connections.py && sleep 0' 30583 1726853728.99131: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853728.99147: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853728.99295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853728.99321: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853728.99434: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853729.01383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853729.01389: stdout chunk (state=3): >>><<< 30583 1726853729.01391: stderr chunk (state=3): >>><<< 30583 1726853729.01510: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853729.01517: _low_level_execute_command(): starting 30583 1726853729.01520: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853728.9010952-33626-26667901254065/AnsiballZ_network_connections.py && sleep 0' 30583 1726853729.02222: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853729.02238: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853729.02254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853729.02278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853729.02319: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853729.02438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853729.02520: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853729.02599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853729.28519: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 11d9efea-f4e2-4de6-9b17-bfa7490d4840 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30583 1726853729.30626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853729.30632: stdout chunk (state=3): >>><<< 30583 1726853729.30636: stderr chunk (state=3): >>><<< 30583 1726853729.30640: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 11d9efea-f4e2-4de6-9b17-bfa7490d4840 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853729.30644: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853728.9010952-33626-26667901254065/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853729.30647: _low_level_execute_command(): starting 30583 1726853729.30649: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853728.9010952-33626-26667901254065/ > /dev/null 2>&1 && sleep 0' 30583 1726853729.31305: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853729.31452: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853729.31625: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853729.31665: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853729.31762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853729.33808: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853729.33836: stderr chunk (state=3): >>><<< 30583 1726853729.33838: stdout chunk (state=3): >>><<< 30583 1726853729.33851: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853729.33875: handler run complete 30583 1726853729.33885: attempt loop complete, returning result 30583 1726853729.33890: _execute() done 30583 1726853729.33894: dumping result to json 30583 1726853729.33901: done dumping result, returning 30583 1726853729.33936: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-05ea-abc5-00000000128a] 30583 1726853729.33939: sending task result for task 02083763-bbaf-05ea-abc5-00000000128a 30583 1726853729.34014: done sending task result for task 02083763-bbaf-05ea-abc5-00000000128a 30583 1726853729.34017: WORKER PROCESS EXITING ok: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 11d9efea-f4e2-4de6-9b17-bfa7490d4840 skipped because already active 30583 1726853729.34134: no more pending results, returning what we have 30583 1726853729.34137: results queue empty 30583 1726853729.34138: checking for any_errors_fatal 30583 1726853729.34146: done checking for any_errors_fatal 30583 1726853729.34146: checking for max_fail_percentage 30583 1726853729.34148: done checking for max_fail_percentage 30583 1726853729.34149: checking to see if all hosts have failed and the running result is not ok 30583 1726853729.34150: done checking to see if all hosts have failed 30583 1726853729.34155: getting the remaining hosts for this loop 30583 1726853729.34157: done getting the remaining hosts for this loop 30583 1726853729.34160: getting the next task for host managed_node2 30583 
1726853729.34167: done getting next task for host managed_node2 30583 1726853729.34172: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853729.34176: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853729.34190: getting variables 30583 1726853729.34191: in VariableManager get_vars() 30583 1726853729.34229: Calling all_inventory to load vars for managed_node2 30583 1726853729.34231: Calling groups_inventory to load vars for managed_node2 30583 1726853729.34233: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853729.34243: Calling all_plugins_play to load vars for managed_node2 30583 1726853729.34246: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853729.34248: Calling groups_plugins_play to load vars for managed_node2 30583 1726853729.35643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853729.37723: done with get_vars() 30583 1726853729.37741: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:35:29 -0400 (0:00:00.637) 0:01:04.715 ****** 30583 1726853729.37808: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30583 1726853729.38069: worker is 1 (out of 1 available) 30583 1726853729.38087: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30583 1726853729.38101: done queuing things up, now waiting for results queue to drain 30583 1726853729.38102: waiting for pending results... 
30583 1726853729.38293: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853729.38391: in run() - task 02083763-bbaf-05ea-abc5-00000000128b 30583 1726853729.38403: variable 'ansible_search_path' from source: unknown 30583 1726853729.38408: variable 'ansible_search_path' from source: unknown 30583 1726853729.38436: calling self._execute() 30583 1726853729.38517: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853729.38522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853729.38530: variable 'omit' from source: magic vars 30583 1726853729.38997: variable 'ansible_distribution_major_version' from source: facts 30583 1726853729.39000: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853729.39054: variable 'network_state' from source: role '' defaults 30583 1726853729.39065: Evaluated conditional (network_state != {}): False 30583 1726853729.39068: when evaluation is False, skipping this task 30583 1726853729.39073: _execute() done 30583 1726853729.39076: dumping result to json 30583 1726853729.39079: done dumping result, returning 30583 1726853729.39087: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-05ea-abc5-00000000128b] 30583 1726853729.39090: sending task result for task 02083763-bbaf-05ea-abc5-00000000128b 30583 1726853729.39182: done sending task result for task 02083763-bbaf-05ea-abc5-00000000128b 30583 1726853729.39185: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853729.39253: no more pending results, returning what we have 30583 1726853729.39260: results queue empty 30583 1726853729.39261: checking for any_errors_fatal 30583 1726853729.39276: done checking for any_errors_fatal 
30583 1726853729.39277: checking for max_fail_percentage 30583 1726853729.39279: done checking for max_fail_percentage 30583 1726853729.39280: checking to see if all hosts have failed and the running result is not ok 30583 1726853729.39280: done checking to see if all hosts have failed 30583 1726853729.39281: getting the remaining hosts for this loop 30583 1726853729.39283: done getting the remaining hosts for this loop 30583 1726853729.39286: getting the next task for host managed_node2 30583 1726853729.39293: done getting next task for host managed_node2 30583 1726853729.39296: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30583 1726853729.39301: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853729.39323: getting variables 30583 1726853729.39324: in VariableManager get_vars() 30583 1726853729.39355: Calling all_inventory to load vars for managed_node2 30583 1726853729.39360: Calling groups_inventory to load vars for managed_node2 30583 1726853729.39362: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853729.39369: Calling all_plugins_play to load vars for managed_node2 30583 1726853729.39434: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853729.39439: Calling groups_plugins_play to load vars for managed_node2 30583 1726853729.40541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853729.42048: done with get_vars() 30583 1726853729.42078: done getting variables 30583 1726853729.42140: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:35:29 -0400 (0:00:00.043) 0:01:04.759 ****** 30583 1726853729.42182: entering _queue_task() for managed_node2/debug 30583 1726853729.42534: worker is 1 (out of 1 available) 30583 1726853729.42549: exiting _queue_task() for managed_node2/debug 30583 1726853729.42566: done queuing things up, now waiting for results queue to drain 30583 1726853729.42568: waiting for pending results... 
30583 1726853729.42901: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30583 1726853729.43002: in run() - task 02083763-bbaf-05ea-abc5-00000000128c 30583 1726853729.43016: variable 'ansible_search_path' from source: unknown 30583 1726853729.43020: variable 'ansible_search_path' from source: unknown 30583 1726853729.43105: calling self._execute() 30583 1726853729.43202: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853729.43209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853729.43216: variable 'omit' from source: magic vars 30583 1726853729.43863: variable 'ansible_distribution_major_version' from source: facts 30583 1726853729.43872: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853729.43879: variable 'omit' from source: magic vars 30583 1726853729.43955: variable 'omit' from source: magic vars 30583 1726853729.43993: variable 'omit' from source: magic vars 30583 1726853729.44040: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853729.44076: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853729.44134: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853729.44137: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853729.44140: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853729.44176: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853729.44180: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853729.44184: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 30583 1726853729.44289: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853729.44374: Set connection var ansible_timeout to 10 30583 1726853729.44378: Set connection var ansible_connection to ssh 30583 1726853729.44381: Set connection var ansible_shell_executable to /bin/sh 30583 1726853729.44383: Set connection var ansible_shell_type to sh 30583 1726853729.44385: Set connection var ansible_pipelining to False 30583 1726853729.44387: variable 'ansible_shell_executable' from source: unknown 30583 1726853729.44389: variable 'ansible_connection' from source: unknown 30583 1726853729.44391: variable 'ansible_module_compression' from source: unknown 30583 1726853729.44393: variable 'ansible_shell_type' from source: unknown 30583 1726853729.44395: variable 'ansible_shell_executable' from source: unknown 30583 1726853729.44397: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853729.44399: variable 'ansible_pipelining' from source: unknown 30583 1726853729.44401: variable 'ansible_timeout' from source: unknown 30583 1726853729.44403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853729.44514: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853729.44576: variable 'omit' from source: magic vars 30583 1726853729.44579: starting attempt loop 30583 1726853729.44582: running the handler 30583 1726853729.44664: variable '__network_connections_result' from source: set_fact 30583 1726853729.44777: handler run complete 30583 1726853729.44780: attempt loop complete, returning result 30583 1726853729.44782: _execute() done 30583 1726853729.44785: dumping result to json 30583 1726853729.44787: 
done dumping result, returning 30583 1726853729.44790: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-05ea-abc5-00000000128c] 30583 1726853729.44792: sending task result for task 02083763-bbaf-05ea-abc5-00000000128c 30583 1726853729.45076: done sending task result for task 02083763-bbaf-05ea-abc5-00000000128c 30583 1726853729.45079: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 11d9efea-f4e2-4de6-9b17-bfa7490d4840 skipped because already active" ] } 30583 1726853729.45134: no more pending results, returning what we have 30583 1726853729.45137: results queue empty 30583 1726853729.45138: checking for any_errors_fatal 30583 1726853729.45143: done checking for any_errors_fatal 30583 1726853729.45144: checking for max_fail_percentage 30583 1726853729.45145: done checking for max_fail_percentage 30583 1726853729.45146: checking to see if all hosts have failed and the running result is not ok 30583 1726853729.45147: done checking to see if all hosts have failed 30583 1726853729.45148: getting the remaining hosts for this loop 30583 1726853729.45149: done getting the remaining hosts for this loop 30583 1726853729.45152: getting the next task for host managed_node2 30583 1726853729.45162: done getting next task for host managed_node2 30583 1726853729.45165: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30583 1726853729.45170: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853729.45184: getting variables 30583 1726853729.45186: in VariableManager get_vars() 30583 1726853729.45218: Calling all_inventory to load vars for managed_node2 30583 1726853729.45221: Calling groups_inventory to load vars for managed_node2 30583 1726853729.45224: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853729.45232: Calling all_plugins_play to load vars for managed_node2 30583 1726853729.45234: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853729.45237: Calling groups_plugins_play to load vars for managed_node2 30583 1726853729.46742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853729.48288: done with get_vars() 30583 1726853729.48315: done getting variables 30583 1726853729.48379: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:35:29 -0400 (0:00:00.062) 0:01:04.821 ****** 30583 1726853729.48421: entering _queue_task() for managed_node2/debug 30583 1726853729.48779: worker is 1 (out of 1 available) 30583 1726853729.48793: exiting _queue_task() for managed_node2/debug 30583 1726853729.48805: done queuing things up, now waiting for results queue to drain 30583 1726853729.48807: waiting for pending results... 30583 1726853729.49110: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30583 1726853729.49281: in run() - task 02083763-bbaf-05ea-abc5-00000000128d 30583 1726853729.49303: variable 'ansible_search_path' from source: unknown 30583 1726853729.49311: variable 'ansible_search_path' from source: unknown 30583 1726853729.49361: calling self._execute() 30583 1726853729.49470: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853729.49485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853729.49500: variable 'omit' from source: magic vars 30583 1726853729.49896: variable 'ansible_distribution_major_version' from source: facts 30583 1726853729.49913: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853729.49924: variable 'omit' from source: magic vars 30583 1726853729.49999: variable 'omit' from source: magic vars 30583 1726853729.50036: variable 'omit' from source: magic vars 30583 1726853729.50090: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853729.50176: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853729.50179: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853729.50181: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30583 1726853729.50191: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30583 1726853729.50228: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30583 1726853729.50237: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853729.50245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853729.50352: Set connection var ansible_module_compression to ZIP_DEFLATED
30583 1726853729.50367: Set connection var ansible_timeout to 10
30583 1726853729.50377: Set connection var ansible_connection to ssh
30583 1726853729.50388: Set connection var ansible_shell_executable to /bin/sh
30583 1726853729.50396: Set connection var ansible_shell_type to sh
30583 1726853729.50421: Set connection var ansible_pipelining to False
30583 1726853729.50442: variable 'ansible_shell_executable' from source: unknown
30583 1726853729.50531: variable 'ansible_connection' from source: unknown
30583 1726853729.50534: variable 'ansible_module_compression' from source: unknown
30583 1726853729.50536: variable 'ansible_shell_type' from source: unknown
30583 1726853729.50539: variable 'ansible_shell_executable' from source: unknown
30583 1726853729.50540: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853729.50542: variable 'ansible_pipelining' from source: unknown
30583 1726853729.50544: variable 'ansible_timeout' from source: unknown
30583 1726853729.50546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853729.50628: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30583 1726853729.50647: variable 'omit' from source: magic vars
30583 1726853729.50656: starting attempt loop
30583 1726853729.50664: running the handler
30583 1726853729.50715: variable '__network_connections_result' from source: set_fact
30583 1726853729.50799: variable '__network_connections_result' from source: set_fact
30583 1726853729.50924: handler run complete
30583 1726853729.50955: attempt loop complete, returning result
30583 1726853729.50972: _execute() done
30583 1726853729.50980: dumping result to json
30583 1726853729.50989: done dumping result, returning
30583 1726853729.51002: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-05ea-abc5-00000000128d]
30583 1726853729.51077: sending task result for task 02083763-bbaf-05ea-abc5-00000000128d
30583 1726853729.51154: done sending task result for task 02083763-bbaf-05ea-abc5-00000000128d
30583 1726853729.51160: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "statebr",
                        "state": "up"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": false,
        "failed": false,
        "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 11d9efea-f4e2-4de6-9b17-bfa7490d4840 skipped because already active\n",
        "stderr_lines": [
            "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 11d9efea-f4e2-4de6-9b17-bfa7490d4840 skipped because already active"
        ]
    }
}
30583 1726853729.51289: no more pending results, returning what we have
30583 1726853729.51294: results queue empty
30583 1726853729.51295: checking for any_errors_fatal
30583 1726853729.51304: done checking for any_errors_fatal
30583 1726853729.51305: checking for max_fail_percentage
30583 1726853729.51307: done checking for max_fail_percentage
30583 1726853729.51308: checking to see if all hosts have failed and the running result is not ok
30583 1726853729.51309: done checking to see if all hosts have failed
30583 1726853729.51310: getting the remaining hosts for this loop
30583 1726853729.51312: done getting the remaining hosts for this loop
30583 1726853729.51316: getting the next task for host managed_node2
30583 1726853729.51324: done getting next task for host managed_node2
30583 1726853729.51329: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
30583 1726853729.51333: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853729.51349: getting variables
30583 1726853729.51351: in VariableManager get_vars()
30583 1726853729.51598: Calling all_inventory to load vars for managed_node2
30583 1726853729.51601: Calling groups_inventory to load vars for managed_node2
30583 1726853729.51610: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853729.51620: Calling all_plugins_play to load vars for managed_node2
30583 1726853729.51624: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853729.51627: Calling groups_plugins_play to load vars for managed_node2
30583 1726853729.53316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853729.55342: done with get_vars()
30583 1726853729.55388: done getting variables
30583 1726853729.55451: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024 13:35:29 -0400 (0:00:00.070) 0:01:04.892 ******
30583 1726853729.55492: entering _queue_task() for managed_node2/debug
30583 1726853729.55876: worker is 1 (out of 1 available)
30583 1726853729.55889: exiting _queue_task() for managed_node2/debug
30583 1726853729.55901: done queuing things up, now waiting for results queue to drain
30583 1726853729.55903: waiting for pending results...
30583 1726853729.56289: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
30583 1726853729.56376: in run() - task 02083763-bbaf-05ea-abc5-00000000128e
30583 1726853729.56404: variable 'ansible_search_path' from source: unknown
30583 1726853729.56411: variable 'ansible_search_path' from source: unknown
30583 1726853729.56455: calling self._execute()
30583 1726853729.56576: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853729.56590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853729.56610: variable 'omit' from source: magic vars
30583 1726853729.57023: variable 'ansible_distribution_major_version' from source: facts
30583 1726853729.57064: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853729.57374: variable 'network_state' from source: role '' defaults
30583 1726853729.57378: Evaluated conditional (network_state != {}): False
30583 1726853729.57381: when evaluation is False, skipping this task
30583 1726853729.57384: _execute() done
30583 1726853729.57386: dumping result to json
30583 1726853729.57388: done dumping result, returning
30583 1726853729.57404: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-05ea-abc5-00000000128e]
30583 1726853729.57492: sending task result for task 02083763-bbaf-05ea-abc5-00000000128e
skipping: [managed_node2] => {
    "false_condition": "network_state != {}"
}
30583 1726853729.57766: no more pending results, returning what we have
30583 1726853729.57773: results queue empty
30583 1726853729.57774: checking for any_errors_fatal
30583 1726853729.57786: done checking for any_errors_fatal
30583 1726853729.57787: checking for max_fail_percentage
30583 1726853729.57789: done checking for max_fail_percentage
30583 1726853729.57790: checking to see if all hosts have failed and the running result is not ok
30583 1726853729.57791: done checking to see if all hosts have failed
30583 1726853729.57793: getting the remaining hosts for this loop
30583 1726853729.57795: done getting the remaining hosts for this loop
30583 1726853729.57799: getting the next task for host managed_node2
30583 1726853729.57809: done getting next task for host managed_node2
30583 1726853729.57813: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
30583 1726853729.57819: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853729.57846: getting variables
30583 1726853729.57848: in VariableManager get_vars()
30583 1726853729.58196: Calling all_inventory to load vars for managed_node2
30583 1726853729.58200: Calling groups_inventory to load vars for managed_node2
30583 1726853729.58202: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853729.58211: Calling all_plugins_play to load vars for managed_node2
30583 1726853729.58213: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853729.58216: Calling groups_plugins_play to load vars for managed_node2
30583 1726853729.58993: done sending task result for task 02083763-bbaf-05ea-abc5-00000000128e
30583 1726853729.58997: WORKER PROCESS EXITING
30583 1726853729.61433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853729.64114: done with get_vars()
30583 1726853729.64145: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024 13:35:29 -0400 (0:00:00.087) 0:01:04.979 ******
30583 1726853729.64253: entering _queue_task() for managed_node2/ping
30583 1726853729.64626: worker is 1 (out of 1 available)
30583 1726853729.64639: exiting _queue_task() for managed_node2/ping
30583 1726853729.64651: done queuing things up, now waiting for results queue to drain
30583 1726853729.64652: waiting for pending results...
30583 1726853729.65256: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity
30583 1726853729.65546: in run() - task 02083763-bbaf-05ea-abc5-00000000128f
30583 1726853729.65683: variable 'ansible_search_path' from source: unknown
30583 1726853729.65693: variable 'ansible_search_path' from source: unknown
30583 1726853729.65732: calling self._execute()
30583 1726853729.65945: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853729.65985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853729.66001: variable 'omit' from source: magic vars
30583 1726853729.66804: variable 'ansible_distribution_major_version' from source: facts
30583 1726853729.66867: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853729.66881: variable 'omit' from source: magic vars
30583 1726853729.67072: variable 'omit' from source: magic vars
30583 1726853729.67112: variable 'omit' from source: magic vars
30583 1726853729.67159: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30583 1726853729.67217: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30583 1726853729.67302: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30583 1726853729.67361: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30583 1726853729.67406: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30583 1726853729.67612: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30583 1726853729.67616: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853729.67618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853729.67716: Set connection var ansible_module_compression to ZIP_DEFLATED
30583 1726853729.67940: Set connection var ansible_timeout to 10
30583 1726853729.67943: Set connection var ansible_connection to ssh
30583 1726853729.67946: Set connection var ansible_shell_executable to /bin/sh
30583 1726853729.67948: Set connection var ansible_shell_type to sh
30583 1726853729.67950: Set connection var ansible_pipelining to False
30583 1726853729.67951: variable 'ansible_shell_executable' from source: unknown
30583 1726853729.67953: variable 'ansible_connection' from source: unknown
30583 1726853729.67955: variable 'ansible_module_compression' from source: unknown
30583 1726853729.67957: variable 'ansible_shell_type' from source: unknown
30583 1726853729.67958: variable 'ansible_shell_executable' from source: unknown
30583 1726853729.67960: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853729.67962: variable 'ansible_pipelining' from source: unknown
30583 1726853729.67964: variable 'ansible_timeout' from source: unknown
30583 1726853729.67965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853729.68480: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
30583 1726853729.68485: variable 'omit' from source: magic vars
30583 1726853729.68488: starting attempt loop
30583 1726853729.68490: running the handler
30583 1726853729.68493: _low_level_execute_command(): starting
30583 1726853729.68496: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
30583 1726853729.69994: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853729.70111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<<
30583 1726853729.70119: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30583 1726853729.70229: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30583 1726853729.70330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30583 1726853729.72102: stdout chunk (state=3): >>>/root <<<
30583 1726853729.72229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30583 1726853729.72266: stderr chunk (state=3): >>><<<
30583 1726853729.72476: stdout chunk (state=3): >>><<<
30583 1726853729.72480: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30583 1726853729.72483: _low_level_execute_command(): starting
30583 1726853729.72486: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853729.723626-33670-110039360972063 `" && echo ansible-tmp-1726853729.723626-33670-110039360972063="` echo /root/.ansible/tmp/ansible-tmp-1726853729.723626-33670-110039360972063 `" ) && sleep 0'
30583 1726853729.73606: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
30583 1726853729.73622: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
30583 1726853729.73639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
30583 1726853729.73865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853729.73904: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<<
30583 1726853729.73930: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30583 1726853729.74230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30583 1726853729.76496: stdout chunk (state=3): >>>ansible-tmp-1726853729.723626-33670-110039360972063=/root/.ansible/tmp/ansible-tmp-1726853729.723626-33670-110039360972063 <<<
30583 1726853729.76678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30583 1726853729.76681: stdout chunk (state=3): >>><<<
30583 1726853729.76684: stderr chunk (state=3): >>><<<
30583 1726853729.76686: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853729.723626-33670-110039360972063=/root/.ansible/tmp/ansible-tmp-1726853729.723626-33670-110039360972063 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30583 1726853729.76689: variable 'ansible_module_compression' from source: unknown
30583 1726853729.76784: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED
30583 1726853729.77277: variable 'ansible_facts' from source: unknown
30583 1726853729.77281: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853729.723626-33670-110039360972063/AnsiballZ_ping.py
30583 1726853729.77633: Sending initial data
30583 1726853729.77637: Sent initial data (152 bytes)
30583 1726853729.79018: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
30583 1726853729.79027: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
30583 1726853729.79081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
30583 1726853729.79095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30583 1726853729.79109: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<<
30583 1726853729.79290: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853729.79355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
30583 1726853729.79430: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30583 1726853729.81148: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
30583 1726853729.81208: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "."
<<<
30583 1726853729.81299: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpra09nrq6 /root/.ansible/tmp/ansible-tmp-1726853729.723626-33670-110039360972063/AnsiballZ_ping.py <<<
30583 1726853729.81328: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853729.723626-33670-110039360972063/AnsiballZ_ping.py" <<<
30583 1726853729.81626: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpra09nrq6" to remote "/root/.ansible/tmp/ansible-tmp-1726853729.723626-33670-110039360972063/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853729.723626-33670-110039360972063/AnsiballZ_ping.py" <<<
30583 1726853729.82855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30583 1726853729.83039: stderr chunk (state=3): >>><<<
30583 1726853729.83043: stdout chunk (state=3): >>><<<
30583 1726853729.83073: done transferring module to remote
30583 1726853729.83084: _low_level_execute_command(): starting
30583 1726853729.83089: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853729.723626-33670-110039360972063/ /root/.ansible/tmp/ansible-tmp-1726853729.723626-33670-110039360972063/AnsiballZ_ping.py && sleep 0'
30583 1726853729.84205: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30583 1726853729.84239: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853729.84245: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
30583 1726853729.84276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
30583 1726853729.84280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853729.84382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<<
30583 1726853729.84385: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
30583 1726853729.84452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30583 1726853729.86364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30583 1726853729.86719: stderr chunk (state=3): >>><<<
30583 1726853729.86723: stdout chunk (state=3): >>><<<
30583 1726853729.86725: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30583 1726853729.86727: _low_level_execute_command(): starting
30583 1726853729.86730: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853729.723626-33670-110039360972063/AnsiballZ_ping.py && sleep 0'
30583 1726853729.87893: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853729.88088: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<<
30583 1726853729.88098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30583 1726853729.88204: stderr chunk (state=3): >>>debug1:
mux_client_request_session: master session id: 2 <<<
30583 1726853730.03574: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<<
30583 1726853730.05005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<<
30583 1726853730.05009: stdout chunk (state=3): >>><<<
30583 1726853730.05012: stderr chunk (state=3): >>><<<
30583 1726853730.05088: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed.
30583 1726853730.05100: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853729.723626-33670-110039360972063/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
30583 1726853730.05147: _low_level_execute_command(): starting
30583 1726853730.05158: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853729.723626-33670-110039360972063/ > /dev/null 2>&1 && sleep 0'
30583 1726853730.06469: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
30583 1726853730.06488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
30583 1726853730.06504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
30583 1726853730.06532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30583 1726853730.06575: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<<
30583 1726853730.06592: stderr chunk (state=3): >>>debug2: match not found <<<
30583 1726853730.06632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853730.06654: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
30583 1726853730.06766: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<<
30583 1726853730.06970: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30583 1726853730.07304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30583 1726853730.09133: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30583 1726853730.09143: stdout chunk (state=3): >>><<<
30583 1726853730.09154: stderr chunk (state=3): >>><<<
30583 1726853730.09178: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30583 1726853730.09194: handler run complete
30583 1726853730.09215: attempt loop complete, returning result
30583 1726853730.09222: _execute() done
30583 1726853730.09228: dumping result to json
30583 1726853730.09235: done dumping result, returning
30583 1726853730.09248: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-05ea-abc5-00000000128f]
30583 1726853730.09256: sending task result for task 02083763-bbaf-05ea-abc5-00000000128f
ok: [managed_node2] => {
    "changed": false,
    "ping": "pong"
}
30583 1726853730.09448: no more pending results, returning what we have
30583 1726853730.09452: results queue empty
30583 1726853730.09453: checking for any_errors_fatal
30583 1726853730.09463: done checking for any_errors_fatal
30583 1726853730.09464: checking for max_fail_percentage
30583 1726853730.09466: done checking for max_fail_percentage
30583 1726853730.09467: checking to see if all hosts have failed and the running result is not ok
30583 1726853730.09467: done checking to see if all hosts have failed
30583 1726853730.09468: getting the remaining hosts for this loop
30583 1726853730.09470: done getting the remaining hosts for this loop
30583 1726853730.09477: getting the next task for host managed_node2
30583 1726853730.09573: done getting next task for host managed_node2
30583 1726853730.09577: ^ task is: TASK: meta (role_complete)
30583 1726853730.09582: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853730.09601: getting variables 30583 1726853730.09604: in VariableManager get_vars() 30583 1726853730.09641: Calling all_inventory to load vars for managed_node2 30583 1726853730.09643: Calling groups_inventory to load vars for managed_node2 30583 1726853730.09646: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853730.09655: Calling all_plugins_play to load vars for managed_node2 30583 1726853730.09659: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853730.09663: Calling groups_plugins_play to load vars for managed_node2 30583 1726853730.10293: done sending task result for task 02083763-bbaf-05ea-abc5-00000000128f 30583 1726853730.10297: WORKER PROCESS EXITING 30583 1726853730.12083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853730.13969: done with get_vars() 30583 1726853730.13995: done getting variables 30583 1726853730.14101: done queuing things up, now waiting for results queue to drain 30583 1726853730.14103: results queue empty 30583 1726853730.14104: checking for any_errors_fatal 30583 1726853730.14107: done checking for any_errors_fatal 30583 1726853730.14108: 
checking for max_fail_percentage 30583 1726853730.14109: done checking for max_fail_percentage 30583 1726853730.14109: checking to see if all hosts have failed and the running result is not ok 30583 1726853730.14110: done checking to see if all hosts have failed 30583 1726853730.14111: getting the remaining hosts for this loop 30583 1726853730.14111: done getting the remaining hosts for this loop 30583 1726853730.14114: getting the next task for host managed_node2 30583 1726853730.14119: done getting next task for host managed_node2 30583 1726853730.14127: ^ task is: TASK: Test 30583 1726853730.14129: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853730.14132: getting variables 30583 1726853730.14133: in VariableManager get_vars() 30583 1726853730.14144: Calling all_inventory to load vars for managed_node2 30583 1726853730.14146: Calling groups_inventory to load vars for managed_node2 30583 1726853730.14148: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853730.14161: Calling all_plugins_play to load vars for managed_node2 30583 1726853730.14164: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853730.14167: Calling groups_plugins_play to load vars for managed_node2 30583 1726853730.14853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853730.15699: done with get_vars() 30583 1726853730.15714: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 13:35:30 -0400 (0:00:00.515) 0:01:05.494 ****** 30583 1726853730.15766: entering _queue_task() for managed_node2/include_tasks 30583 1726853730.16061: worker is 1 (out of 1 available) 30583 1726853730.16075: exiting _queue_task() for managed_node2/include_tasks 30583 1726853730.16090: done queuing things up, now waiting for results queue to drain 30583 1726853730.16091: waiting for pending results... 
30583 1726853730.16387: running TaskExecutor() for managed_node2/TASK: Test 30583 1726853730.16520: in run() - task 02083763-bbaf-05ea-abc5-000000001009 30583 1726853730.16540: variable 'ansible_search_path' from source: unknown 30583 1726853730.16548: variable 'ansible_search_path' from source: unknown 30583 1726853730.16602: variable 'lsr_test' from source: include params 30583 1726853730.16821: variable 'lsr_test' from source: include params 30583 1726853730.16899: variable 'omit' from source: magic vars 30583 1726853730.17065: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853730.17110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853730.17135: variable 'omit' from source: magic vars 30583 1726853730.17353: variable 'ansible_distribution_major_version' from source: facts 30583 1726853730.17357: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853730.17366: variable 'item' from source: unknown 30583 1726853730.17414: variable 'item' from source: unknown 30583 1726853730.17436: variable 'item' from source: unknown 30583 1726853730.17484: variable 'item' from source: unknown 30583 1726853730.17619: dumping result to json 30583 1726853730.17622: done dumping result, returning 30583 1726853730.17624: done running TaskExecutor() for managed_node2/TASK: Test [02083763-bbaf-05ea-abc5-000000001009] 30583 1726853730.17626: sending task result for task 02083763-bbaf-05ea-abc5-000000001009 30583 1726853730.17659: done sending task result for task 02083763-bbaf-05ea-abc5-000000001009 30583 1726853730.17662: WORKER PROCESS EXITING 30583 1726853730.17688: no more pending results, returning what we have 30583 1726853730.17694: in VariableManager get_vars() 30583 1726853730.17733: Calling all_inventory to load vars for managed_node2 30583 1726853730.17736: Calling groups_inventory to load vars for managed_node2 30583 1726853730.17739: Calling all_plugins_inventory to load 
vars for managed_node2 30583 1726853730.17752: Calling all_plugins_play to load vars for managed_node2 30583 1726853730.17755: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853730.17758: Calling groups_plugins_play to load vars for managed_node2 30583 1726853730.18667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853730.19892: done with get_vars() 30583 1726853730.19912: variable 'ansible_search_path' from source: unknown 30583 1726853730.19913: variable 'ansible_search_path' from source: unknown 30583 1726853730.19959: we have included files to process 30583 1726853730.19960: generating all_blocks data 30583 1726853730.19962: done generating all_blocks data 30583 1726853730.19968: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 30583 1726853730.19969: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 30583 1726853730.19973: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 30583 1726853730.20156: done processing included file 30583 1726853730.20165: iterating over new_blocks loaded from include file 30583 1726853730.20167: in VariableManager get_vars() 30583 1726853730.20184: done with get_vars() 30583 1726853730.20186: filtering new block on tags 30583 1726853730.20212: done filtering new block on tags 30583 1726853730.20214: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml for managed_node2 => (item=tasks/remove_profile.yml) 30583 1726853730.20219: extending task lists for all hosts with included blocks 30583 1726853730.21130: done extending task lists 30583 
1726853730.21132: done processing included files 30583 1726853730.21140: results queue empty 30583 1726853730.21140: checking for any_errors_fatal 30583 1726853730.21142: done checking for any_errors_fatal 30583 1726853730.21143: checking for max_fail_percentage 30583 1726853730.21144: done checking for max_fail_percentage 30583 1726853730.21145: checking to see if all hosts have failed and the running result is not ok 30583 1726853730.21146: done checking to see if all hosts have failed 30583 1726853730.21146: getting the remaining hosts for this loop 30583 1726853730.21148: done getting the remaining hosts for this loop 30583 1726853730.21150: getting the next task for host managed_node2 30583 1726853730.21155: done getting next task for host managed_node2 30583 1726853730.21157: ^ task is: TASK: Include network role 30583 1726853730.21159: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853730.21162: getting variables 30583 1726853730.21163: in VariableManager get_vars() 30583 1726853730.21176: Calling all_inventory to load vars for managed_node2 30583 1726853730.21179: Calling groups_inventory to load vars for managed_node2 30583 1726853730.21181: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853730.21187: Calling all_plugins_play to load vars for managed_node2 30583 1726853730.21189: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853730.21191: Calling groups_plugins_play to load vars for managed_node2 30583 1726853730.22416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853730.24012: done with get_vars() 30583 1726853730.24036: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml:3 Friday 20 September 2024 13:35:30 -0400 (0:00:00.083) 0:01:05.578 ****** 30583 1726853730.24137: entering _queue_task() for managed_node2/include_role 30583 1726853730.24534: worker is 1 (out of 1 available) 30583 1726853730.24546: exiting _queue_task() for managed_node2/include_role 30583 1726853730.24559: done queuing things up, now waiting for results queue to drain 30583 1726853730.24560: waiting for pending results... 
30583 1726853730.24990: running TaskExecutor() for managed_node2/TASK: Include network role 30583 1726853730.25078: in run() - task 02083763-bbaf-05ea-abc5-0000000013e8 30583 1726853730.25082: variable 'ansible_search_path' from source: unknown 30583 1726853730.25086: variable 'ansible_search_path' from source: unknown 30583 1726853730.25088: calling self._execute() 30583 1726853730.25156: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853730.25163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853730.25182: variable 'omit' from source: magic vars 30583 1726853730.25579: variable 'ansible_distribution_major_version' from source: facts 30583 1726853730.25590: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853730.25596: _execute() done 30583 1726853730.25599: dumping result to json 30583 1726853730.25601: done dumping result, returning 30583 1726853730.25617: done running TaskExecutor() for managed_node2/TASK: Include network role [02083763-bbaf-05ea-abc5-0000000013e8] 30583 1726853730.25623: sending task result for task 02083763-bbaf-05ea-abc5-0000000013e8 30583 1726853730.25758: no more pending results, returning what we have 30583 1726853730.25764: in VariableManager get_vars() 30583 1726853730.25808: Calling all_inventory to load vars for managed_node2 30583 1726853730.25811: Calling groups_inventory to load vars for managed_node2 30583 1726853730.25816: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853730.25831: Calling all_plugins_play to load vars for managed_node2 30583 1726853730.25835: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853730.25838: Calling groups_plugins_play to load vars for managed_node2 30583 1726853730.26483: done sending task result for task 02083763-bbaf-05ea-abc5-0000000013e8 30583 1726853730.26487: WORKER PROCESS EXITING 30583 1726853730.32405: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853730.33939: done with get_vars() 30583 1726853730.33969: variable 'ansible_search_path' from source: unknown 30583 1726853730.33970: variable 'ansible_search_path' from source: unknown 30583 1726853730.34107: variable 'omit' from source: magic vars 30583 1726853730.34145: variable 'omit' from source: magic vars 30583 1726853730.34162: variable 'omit' from source: magic vars 30583 1726853730.34165: we have included files to process 30583 1726853730.34166: generating all_blocks data 30583 1726853730.34168: done generating all_blocks data 30583 1726853730.34169: processing included file: fedora.linux_system_roles.network 30583 1726853730.34190: in VariableManager get_vars() 30583 1726853730.34204: done with get_vars() 30583 1726853730.34227: in VariableManager get_vars() 30583 1726853730.34245: done with get_vars() 30583 1726853730.34282: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30583 1726853730.34398: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30583 1726853730.34478: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30583 1726853730.34922: in VariableManager get_vars() 30583 1726853730.34941: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30583 1726853730.36835: iterating over new_blocks loaded from include file 30583 1726853730.36837: in VariableManager get_vars() 30583 1726853730.36854: done with get_vars() 30583 1726853730.36856: filtering new block on tags 30583 1726853730.37135: done filtering new block on tags 30583 1726853730.37138: in VariableManager get_vars() 30583 1726853730.37153: done with get_vars() 30583 1726853730.37154: filtering new block on tags 30583 1726853730.37175: done 
filtering new block on tags 30583 1726853730.37177: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30583 1726853730.37181: extending task lists for all hosts with included blocks 30583 1726853730.37292: done extending task lists 30583 1726853730.37294: done processing included files 30583 1726853730.37294: results queue empty 30583 1726853730.37295: checking for any_errors_fatal 30583 1726853730.37298: done checking for any_errors_fatal 30583 1726853730.37299: checking for max_fail_percentage 30583 1726853730.37300: done checking for max_fail_percentage 30583 1726853730.37301: checking to see if all hosts have failed and the running result is not ok 30583 1726853730.37302: done checking to see if all hosts have failed 30583 1726853730.37302: getting the remaining hosts for this loop 30583 1726853730.37303: done getting the remaining hosts for this loop 30583 1726853730.37306: getting the next task for host managed_node2 30583 1726853730.37310: done getting next task for host managed_node2 30583 1726853730.37312: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30583 1726853730.37315: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853730.37325: getting variables 30583 1726853730.37326: in VariableManager get_vars() 30583 1726853730.37338: Calling all_inventory to load vars for managed_node2 30583 1726853730.37340: Calling groups_inventory to load vars for managed_node2 30583 1726853730.37342: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853730.37347: Calling all_plugins_play to load vars for managed_node2 30583 1726853730.37350: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853730.37352: Calling groups_plugins_play to load vars for managed_node2 30583 1726853730.38584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853730.40131: done with get_vars() 30583 1726853730.40154: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:35:30 -0400 (0:00:00.160) 0:01:05.739 ****** 30583 1726853730.40233: entering _queue_task() for managed_node2/include_tasks 30583 1726853730.40623: worker is 1 (out of 1 available) 30583 1726853730.40637: exiting _queue_task() for managed_node2/include_tasks 30583 1726853730.40650: done queuing things up, now waiting for results queue to drain 30583 1726853730.40651: waiting for pending results... 
30583 1726853730.40846: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30583 1726853730.40934: in run() - task 02083763-bbaf-05ea-abc5-00000000145f 30583 1726853730.40945: variable 'ansible_search_path' from source: unknown 30583 1726853730.40949: variable 'ansible_search_path' from source: unknown 30583 1726853730.40985: calling self._execute() 30583 1726853730.41063: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853730.41068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853730.41073: variable 'omit' from source: magic vars 30583 1726853730.41369: variable 'ansible_distribution_major_version' from source: facts 30583 1726853730.41380: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853730.41386: _execute() done 30583 1726853730.41389: dumping result to json 30583 1726853730.41392: done dumping result, returning 30583 1726853730.41399: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-05ea-abc5-00000000145f] 30583 1726853730.41404: sending task result for task 02083763-bbaf-05ea-abc5-00000000145f 30583 1726853730.41493: done sending task result for task 02083763-bbaf-05ea-abc5-00000000145f 30583 1726853730.41497: WORKER PROCESS EXITING 30583 1726853730.41579: no more pending results, returning what we have 30583 1726853730.41585: in VariableManager get_vars() 30583 1726853730.41627: Calling all_inventory to load vars for managed_node2 30583 1726853730.41630: Calling groups_inventory to load vars for managed_node2 30583 1726853730.41632: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853730.41641: Calling all_plugins_play to load vars for managed_node2 30583 1726853730.41643: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853730.41646: Calling 
groups_plugins_play to load vars for managed_node2 30583 1726853730.42591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853730.44082: done with get_vars() 30583 1726853730.44104: variable 'ansible_search_path' from source: unknown 30583 1726853730.44105: variable 'ansible_search_path' from source: unknown 30583 1726853730.44143: we have included files to process 30583 1726853730.44144: generating all_blocks data 30583 1726853730.44146: done generating all_blocks data 30583 1726853730.44149: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853730.44150: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853730.44152: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853730.44599: done processing included file 30583 1726853730.44600: iterating over new_blocks loaded from include file 30583 1726853730.44601: in VariableManager get_vars() 30583 1726853730.44617: done with get_vars() 30583 1726853730.44618: filtering new block on tags 30583 1726853730.44641: done filtering new block on tags 30583 1726853730.44644: in VariableManager get_vars() 30583 1726853730.44657: done with get_vars() 30583 1726853730.44660: filtering new block on tags 30583 1726853730.44688: done filtering new block on tags 30583 1726853730.44689: in VariableManager get_vars() 30583 1726853730.44703: done with get_vars() 30583 1726853730.44704: filtering new block on tags 30583 1726853730.44727: done filtering new block on tags 30583 1726853730.44729: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30583 1726853730.44733: extending task lists for 
all hosts with included blocks 30583 1726853730.45728: done extending task lists 30583 1726853730.45729: done processing included files 30583 1726853730.45730: results queue empty 30583 1726853730.45730: checking for any_errors_fatal 30583 1726853730.45732: done checking for any_errors_fatal 30583 1726853730.45733: checking for max_fail_percentage 30583 1726853730.45734: done checking for max_fail_percentage 30583 1726853730.45734: checking to see if all hosts have failed and the running result is not ok 30583 1726853730.45735: done checking to see if all hosts have failed 30583 1726853730.45735: getting the remaining hosts for this loop 30583 1726853730.45736: done getting the remaining hosts for this loop 30583 1726853730.45738: getting the next task for host managed_node2 30583 1726853730.45741: done getting next task for host managed_node2 30583 1726853730.45743: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30583 1726853730.45745: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853730.45753: getting variables 30583 1726853730.45753: in VariableManager get_vars() 30583 1726853730.45764: Calling all_inventory to load vars for managed_node2 30583 1726853730.45766: Calling groups_inventory to load vars for managed_node2 30583 1726853730.45767: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853730.45773: Calling all_plugins_play to load vars for managed_node2 30583 1726853730.45774: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853730.45776: Calling groups_plugins_play to load vars for managed_node2 30583 1726853730.46784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853730.47772: done with get_vars() 30583 1726853730.47788: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:35:30 -0400 (0:00:00.076) 0:01:05.815 ****** 30583 1726853730.47843: entering _queue_task() for managed_node2/setup 30583 1726853730.48118: worker is 1 (out of 1 available) 30583 1726853730.48131: exiting _queue_task() for managed_node2/setup 30583 1726853730.48143: done queuing things up, now waiting for results queue to drain 30583 1726853730.48145: waiting for pending results... 
30583 1726853730.48332: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30583 1726853730.48438: in run() - task 02083763-bbaf-05ea-abc5-0000000014b6 30583 1726853730.48450: variable 'ansible_search_path' from source: unknown 30583 1726853730.48454: variable 'ansible_search_path' from source: unknown 30583 1726853730.48486: calling self._execute() 30583 1726853730.48562: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853730.48566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853730.48574: variable 'omit' from source: magic vars 30583 1726853730.48855: variable 'ansible_distribution_major_version' from source: facts 30583 1726853730.48866: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853730.49032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853730.50494: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853730.50537: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853730.50565: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853730.50592: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853730.50611: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853730.50672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853730.50693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853730.50709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853730.50734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853730.50746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853730.50787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853730.50803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853730.50820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853730.50843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853730.50854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853730.50963: variable '__network_required_facts' from source: role 
'' defaults 30583 1726853730.50969: variable 'ansible_facts' from source: unknown 30583 1726853730.51417: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30583 1726853730.51421: when evaluation is False, skipping this task 30583 1726853730.51424: _execute() done 30583 1726853730.51426: dumping result to json 30583 1726853730.51429: done dumping result, returning 30583 1726853730.51440: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-05ea-abc5-0000000014b6] 30583 1726853730.51443: sending task result for task 02083763-bbaf-05ea-abc5-0000000014b6 30583 1726853730.51526: done sending task result for task 02083763-bbaf-05ea-abc5-0000000014b6 30583 1726853730.51530: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853730.51585: no more pending results, returning what we have 30583 1726853730.51592: results queue empty 30583 1726853730.51593: checking for any_errors_fatal 30583 1726853730.51594: done checking for any_errors_fatal 30583 1726853730.51595: checking for max_fail_percentage 30583 1726853730.51597: done checking for max_fail_percentage 30583 1726853730.51597: checking to see if all hosts have failed and the running result is not ok 30583 1726853730.51598: done checking to see if all hosts have failed 30583 1726853730.51599: getting the remaining hosts for this loop 30583 1726853730.51601: done getting the remaining hosts for this loop 30583 1726853730.51604: getting the next task for host managed_node2 30583 1726853730.51616: done getting next task for host managed_node2 30583 1726853730.51619: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30583 1726853730.51625: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853730.51649: getting variables 30583 1726853730.51651: in VariableManager get_vars() 30583 1726853730.51693: Calling all_inventory to load vars for managed_node2 30583 1726853730.51695: Calling groups_inventory to load vars for managed_node2 30583 1726853730.51697: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853730.51706: Calling all_plugins_play to load vars for managed_node2 30583 1726853730.51708: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853730.51716: Calling groups_plugins_play to load vars for managed_node2 30583 1726853730.52515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853730.53498: done with get_vars() 30583 1726853730.53515: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:35:30 -0400 (0:00:00.057) 0:01:05.873 ****** 30583 1726853730.53586: entering _queue_task() for managed_node2/stat 30583 1726853730.53827: worker is 1 (out of 1 available) 30583 1726853730.53842: exiting _queue_task() for managed_node2/stat 30583 1726853730.53856: done queuing things up, now waiting for results queue to drain 30583 1726853730.53860: waiting for pending results... 
30583 1726853730.54048: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30583 1726853730.54160: in run() - task 02083763-bbaf-05ea-abc5-0000000014b8 30583 1726853730.54177: variable 'ansible_search_path' from source: unknown 30583 1726853730.54181: variable 'ansible_search_path' from source: unknown 30583 1726853730.54212: calling self._execute() 30583 1726853730.54292: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853730.54298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853730.54311: variable 'omit' from source: magic vars 30583 1726853730.54600: variable 'ansible_distribution_major_version' from source: facts 30583 1726853730.54610: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853730.54729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853730.54932: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853730.54967: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853730.54994: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853730.55020: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853730.55089: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853730.55107: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853730.55125: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853730.55143: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853730.55209: variable '__network_is_ostree' from source: set_fact 30583 1726853730.55214: Evaluated conditional (not __network_is_ostree is defined): False 30583 1726853730.55217: when evaluation is False, skipping this task 30583 1726853730.55220: _execute() done 30583 1726853730.55222: dumping result to json 30583 1726853730.55226: done dumping result, returning 30583 1726853730.55233: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-05ea-abc5-0000000014b8] 30583 1726853730.55239: sending task result for task 02083763-bbaf-05ea-abc5-0000000014b8 30583 1726853730.55317: done sending task result for task 02083763-bbaf-05ea-abc5-0000000014b8 30583 1726853730.55320: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30583 1726853730.55367: no more pending results, returning what we have 30583 1726853730.55373: results queue empty 30583 1726853730.55374: checking for any_errors_fatal 30583 1726853730.55381: done checking for any_errors_fatal 30583 1726853730.55382: checking for max_fail_percentage 30583 1726853730.55384: done checking for max_fail_percentage 30583 1726853730.55385: checking to see if all hosts have failed and the running result is not ok 30583 1726853730.55386: done checking to see if all hosts have failed 30583 1726853730.55386: getting the remaining hosts for this loop 30583 1726853730.55388: done getting the remaining hosts for this loop 30583 
1726853730.55392: getting the next task for host managed_node2 30583 1726853730.55400: done getting next task for host managed_node2 30583 1726853730.55403: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30583 1726853730.55409: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853730.55430: getting variables 30583 1726853730.55431: in VariableManager get_vars() 30583 1726853730.55466: Calling all_inventory to load vars for managed_node2 30583 1726853730.55468: Calling groups_inventory to load vars for managed_node2 30583 1726853730.55470: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853730.55480: Calling all_plugins_play to load vars for managed_node2 30583 1726853730.55483: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853730.55485: Calling groups_plugins_play to load vars for managed_node2 30583 1726853730.56258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853730.57137: done with get_vars() 30583 1726853730.57152: done getting variables 30583 1726853730.57195: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:35:30 -0400 (0:00:00.036) 0:01:05.909 ****** 30583 1726853730.57223: entering _queue_task() for managed_node2/set_fact 30583 1726853730.57455: worker is 1 (out of 1 available) 30583 1726853730.57467: exiting _queue_task() for managed_node2/set_fact 30583 1726853730.57482: done queuing things up, now waiting for results queue to drain 30583 1726853730.57483: waiting for pending results... 
30583 1726853730.57666: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30583 1726853730.57777: in run() - task 02083763-bbaf-05ea-abc5-0000000014b9 30583 1726853730.57788: variable 'ansible_search_path' from source: unknown 30583 1726853730.57793: variable 'ansible_search_path' from source: unknown 30583 1726853730.57822: calling self._execute() 30583 1726853730.57900: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853730.57904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853730.57912: variable 'omit' from source: magic vars 30583 1726853730.58194: variable 'ansible_distribution_major_version' from source: facts 30583 1726853730.58204: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853730.58323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853730.58526: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853730.58557: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853730.58587: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853730.58613: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853730.58677: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853730.58699: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853730.58717: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853730.58735: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853730.58805: variable '__network_is_ostree' from source: set_fact 30583 1726853730.58811: Evaluated conditional (not __network_is_ostree is defined): False 30583 1726853730.58814: when evaluation is False, skipping this task 30583 1726853730.58817: _execute() done 30583 1726853730.58819: dumping result to json 30583 1726853730.58823: done dumping result, returning 30583 1726853730.58831: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-05ea-abc5-0000000014b9] 30583 1726853730.58834: sending task result for task 02083763-bbaf-05ea-abc5-0000000014b9 30583 1726853730.58912: done sending task result for task 02083763-bbaf-05ea-abc5-0000000014b9 30583 1726853730.58915: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30583 1726853730.58957: no more pending results, returning what we have 30583 1726853730.58961: results queue empty 30583 1726853730.58962: checking for any_errors_fatal 30583 1726853730.58968: done checking for any_errors_fatal 30583 1726853730.58969: checking for max_fail_percentage 30583 1726853730.58972: done checking for max_fail_percentage 30583 1726853730.58973: checking to see if all hosts have failed and the running result is not ok 30583 1726853730.58974: done checking to see if all hosts have failed 30583 1726853730.58974: getting the remaining hosts for this loop 30583 1726853730.58976: done getting the remaining hosts for this loop 
30583 1726853730.58980: getting the next task for host managed_node2 30583 1726853730.58991: done getting next task for host managed_node2 30583 1726853730.58995: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853730.59000: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853730.59022: getting variables 30583 1726853730.59023: in VariableManager get_vars() 30583 1726853730.59058: Calling all_inventory to load vars for managed_node2 30583 1726853730.59061: Calling groups_inventory to load vars for managed_node2 30583 1726853730.59063: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853730.59077: Calling all_plugins_play to load vars for managed_node2 30583 1726853730.59081: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853730.59084: Calling groups_plugins_play to load vars for managed_node2 30583 1726853730.60010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853730.60864: done with get_vars() 30583 1726853730.60881: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:35:30 -0400 (0:00:00.037) 0:01:05.946 ****** 30583 1726853730.60947: entering _queue_task() for managed_node2/service_facts 30583 1726853730.61177: worker is 1 (out of 1 available) 30583 1726853730.61190: exiting _queue_task() for managed_node2/service_facts 30583 1726853730.61204: done queuing things up, now waiting for results queue to drain 30583 1726853730.61205: waiting for pending results... 
30583 1726853730.61391: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853730.61497: in run() - task 02083763-bbaf-05ea-abc5-0000000014bb 30583 1726853730.61508: variable 'ansible_search_path' from source: unknown 30583 1726853730.61512: variable 'ansible_search_path' from source: unknown 30583 1726853730.61541: calling self._execute() 30583 1726853730.61613: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853730.61617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853730.61625: variable 'omit' from source: magic vars 30583 1726853730.61912: variable 'ansible_distribution_major_version' from source: facts 30583 1726853730.61921: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853730.61926: variable 'omit' from source: magic vars 30583 1726853730.61984: variable 'omit' from source: magic vars 30583 1726853730.62007: variable 'omit' from source: magic vars 30583 1726853730.62038: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853730.62068: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853730.62088: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853730.62101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853730.62110: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853730.62134: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853730.62137: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853730.62139: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853730.62213: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853730.62218: Set connection var ansible_timeout to 10 30583 1726853730.62220: Set connection var ansible_connection to ssh 30583 1726853730.62226: Set connection var ansible_shell_executable to /bin/sh 30583 1726853730.62228: Set connection var ansible_shell_type to sh 30583 1726853730.62236: Set connection var ansible_pipelining to False 30583 1726853730.62256: variable 'ansible_shell_executable' from source: unknown 30583 1726853730.62259: variable 'ansible_connection' from source: unknown 30583 1726853730.62264: variable 'ansible_module_compression' from source: unknown 30583 1726853730.62266: variable 'ansible_shell_type' from source: unknown 30583 1726853730.62269: variable 'ansible_shell_executable' from source: unknown 30583 1726853730.62272: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853730.62277: variable 'ansible_pipelining' from source: unknown 30583 1726853730.62279: variable 'ansible_timeout' from source: unknown 30583 1726853730.62283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853730.62424: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853730.62433: variable 'omit' from source: magic vars 30583 1726853730.62439: starting attempt loop 30583 1726853730.62442: running the handler 30583 1726853730.62453: _low_level_execute_command(): starting 30583 1726853730.62459: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853730.62953: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30583 1726853730.62991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853730.62994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853730.62997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853730.63046: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853730.63049: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853730.63051: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853730.63136: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853730.64918: stdout chunk (state=3): >>>/root <<< 30583 1726853730.65080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853730.65084: stdout chunk (state=3): >>><<< 30583 1726853730.65087: stderr chunk (state=3): >>><<< 30583 1726853730.65110: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853730.65177: _low_level_execute_command(): starting 30583 1726853730.65181: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853730.6512492-33708-189915517634129 `" && echo ansible-tmp-1726853730.6512492-33708-189915517634129="` echo /root/.ansible/tmp/ansible-tmp-1726853730.6512492-33708-189915517634129 `" ) && sleep 0' 30583 1726853730.65781: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853730.65796: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853730.65810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853730.65842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853730.65859: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853730.65952: stderr chunk (state=3): >>>debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853730.65994: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853730.66010: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853730.66032: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853730.66152: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853730.68188: stdout chunk (state=3): >>>ansible-tmp-1726853730.6512492-33708-189915517634129=/root/.ansible/tmp/ansible-tmp-1726853730.6512492-33708-189915517634129 <<< 30583 1726853730.68296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853730.68324: stderr chunk (state=3): >>><<< 30583 1726853730.68328: stdout chunk (state=3): >>><<< 30583 1726853730.68343: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853730.6512492-33708-189915517634129=/root/.ansible/tmp/ansible-tmp-1726853730.6512492-33708-189915517634129 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853730.68388: variable 'ansible_module_compression' from source: unknown 30583 1726853730.68426: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30583 1726853730.68457: variable 'ansible_facts' from source: unknown 30583 1726853730.68520: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853730.6512492-33708-189915517634129/AnsiballZ_service_facts.py 30583 1726853730.68620: Sending initial data 30583 1726853730.68624: Sent initial data (162 bytes) 30583 1726853730.69079: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853730.69083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853730.69085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853730.69087: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853730.69089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853730.69103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853730.69132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853730.69146: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853730.69223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853730.70880: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30583 1726853730.70885: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853730.70944: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853730.71025: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp0t3ryde_ /root/.ansible/tmp/ansible-tmp-1726853730.6512492-33708-189915517634129/AnsiballZ_service_facts.py <<< 30583 1726853730.71029: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853730.6512492-33708-189915517634129/AnsiballZ_service_facts.py" <<< 30583 1726853730.71092: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp0t3ryde_" to remote "/root/.ansible/tmp/ansible-tmp-1726853730.6512492-33708-189915517634129/AnsiballZ_service_facts.py" <<< 30583 1726853730.71095: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853730.6512492-33708-189915517634129/AnsiballZ_service_facts.py" <<< 30583 1726853730.71744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853730.71786: stderr chunk (state=3): >>><<< 30583 1726853730.71790: stdout chunk (state=3): >>><<< 30583 1726853730.71834: done transferring module to remote 30583 1726853730.71844: _low_level_execute_command(): starting 30583 1726853730.71848: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853730.6512492-33708-189915517634129/ /root/.ansible/tmp/ansible-tmp-1726853730.6512492-33708-189915517634129/AnsiballZ_service_facts.py && sleep 0' 30583 1726853730.72302: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853730.72305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853730.72307: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853730.72313: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853730.72315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853730.72363: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853730.72367: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853730.72444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853730.74335: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853730.74360: stderr chunk (state=3): >>><<< 30583 1726853730.74363: stdout chunk (state=3): >>><<< 30583 1726853730.74383: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853730.74387: _low_level_execute_command(): starting 30583 1726853730.74390: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853730.6512492-33708-189915517634129/AnsiballZ_service_facts.py && sleep 0' 30583 1726853730.74946: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853730.74952: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853730.74955: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853730.74958: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853730.74985: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853730.75105: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853732.37116: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 30583 1726853732.37128: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 30583 1726853732.37148: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": 
"indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": 
"systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 30583 1726853732.37168: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": 
"systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 30583 1726853732.37187: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": 
{"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", 
"state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "st<<< 30583 1726853732.37210: stdout chunk (state=3): >>>atic", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": 
"systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": 
{"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30583 1726853732.38944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853732.38973: stderr chunk (state=3): >>><<< 30583 1726853732.38976: stdout chunk (state=3): >>><<< 30583 1726853732.39008: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": 
{"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": 
"sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": 
{"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": 
"dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": 
"lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": 
"selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": 
"systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853732.39455: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853730.6512492-33708-189915517634129/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853732.39468: _low_level_execute_command(): starting 30583 1726853732.39475: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853730.6512492-33708-189915517634129/ > /dev/null 2>&1 && sleep 0' 30583 1726853732.39914: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853732.39918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853732.39920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853732.39922: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853732.39924: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853732.39970: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853732.39976: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853732.40052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853732.41974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853732.42000: stderr chunk (state=3): >>><<< 30583 1726853732.42003: stdout chunk (state=3): >>><<< 30583 1726853732.42016: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853732.42022: handler run complete 30583 1726853732.42143: variable 'ansible_facts' from source: unknown 30583 1726853732.42247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853732.42529: variable 'ansible_facts' from source: unknown 30583 1726853732.42613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853732.42728: attempt loop complete, returning result 30583 1726853732.42731: _execute() done 30583 1726853732.42733: dumping result to json 30583 1726853732.42770: done dumping result, returning 30583 1726853732.42780: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-05ea-abc5-0000000014bb] 30583 1726853732.42785: sending task result for task 02083763-bbaf-05ea-abc5-0000000014bb ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853732.43416: done sending task result for task 02083763-bbaf-05ea-abc5-0000000014bb 30583 1726853732.43419: WORKER PROCESS EXITING 30583 1726853732.43433: no more pending results, returning what we have 30583 1726853732.43437: results queue empty 30583 1726853732.43438: checking for any_errors_fatal 30583 1726853732.43441: done checking for any_errors_fatal 30583 1726853732.43442: checking for max_fail_percentage 30583 1726853732.43444: done checking for max_fail_percentage 30583 1726853732.43444: checking to see if all hosts have failed and the running result is not ok 30583 1726853732.43445: 
done checking to see if all hosts have failed 30583 1726853732.43445: getting the remaining hosts for this loop 30583 1726853732.43446: done getting the remaining hosts for this loop 30583 1726853732.43448: getting the next task for host managed_node2 30583 1726853732.43453: done getting next task for host managed_node2 30583 1726853732.43455: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 1726853732.43461: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853732.43468: getting variables 30583 1726853732.43469: in VariableManager get_vars() 30583 1726853732.43495: Calling all_inventory to load vars for managed_node2 30583 1726853732.43502: Calling groups_inventory to load vars for managed_node2 30583 1726853732.43504: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853732.43510: Calling all_plugins_play to load vars for managed_node2 30583 1726853732.43512: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853732.43517: Calling groups_plugins_play to load vars for managed_node2 30583 1726853732.44314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853732.45194: done with get_vars() 30583 1726853732.45210: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:35:32 -0400 (0:00:01.843) 0:01:07.790 ****** 30583 1726853732.45284: entering _queue_task() for managed_node2/package_facts 30583 1726853732.45524: worker is 1 (out of 1 available) 30583 1726853732.45539: exiting _queue_task() for managed_node2/package_facts 30583 1726853732.45553: done queuing things up, now waiting for results queue to drain 30583 1726853732.45555: waiting for pending results... 
30583 1726853732.45742: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 1726853732.45834: in run() - task 02083763-bbaf-05ea-abc5-0000000014bc 30583 1726853732.45846: variable 'ansible_search_path' from source: unknown 30583 1726853732.45851: variable 'ansible_search_path' from source: unknown 30583 1726853732.45882: calling self._execute() 30583 1726853732.45961: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853732.45965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853732.45974: variable 'omit' from source: magic vars 30583 1726853732.46262: variable 'ansible_distribution_major_version' from source: facts 30583 1726853732.46280: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853732.46286: variable 'omit' from source: magic vars 30583 1726853732.46339: variable 'omit' from source: magic vars 30583 1726853732.46364: variable 'omit' from source: magic vars 30583 1726853732.46398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853732.46425: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853732.46443: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853732.46456: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853732.46467: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853732.46493: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853732.46496: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853732.46498: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853732.46569: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853732.46576: Set connection var ansible_timeout to 10 30583 1726853732.46578: Set connection var ansible_connection to ssh 30583 1726853732.46583: Set connection var ansible_shell_executable to /bin/sh 30583 1726853732.46586: Set connection var ansible_shell_type to sh 30583 1726853732.46593: Set connection var ansible_pipelining to False 30583 1726853732.46611: variable 'ansible_shell_executable' from source: unknown 30583 1726853732.46614: variable 'ansible_connection' from source: unknown 30583 1726853732.46617: variable 'ansible_module_compression' from source: unknown 30583 1726853732.46620: variable 'ansible_shell_type' from source: unknown 30583 1726853732.46622: variable 'ansible_shell_executable' from source: unknown 30583 1726853732.46624: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853732.46626: variable 'ansible_pipelining' from source: unknown 30583 1726853732.46628: variable 'ansible_timeout' from source: unknown 30583 1726853732.46633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853732.46775: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853732.46786: variable 'omit' from source: magic vars 30583 1726853732.46791: starting attempt loop 30583 1726853732.46794: running the handler 30583 1726853732.46805: _low_level_execute_command(): starting 30583 1726853732.46812: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853732.47322: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30583 1726853732.47325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853732.47329: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853732.47392: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853732.47395: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853732.47400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853732.47478: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853732.49194: stdout chunk (state=3): >>>/root <<< 30583 1726853732.49297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853732.49325: stderr chunk (state=3): >>><<< 30583 1726853732.49328: stdout chunk (state=3): >>><<< 30583 1726853732.49346: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853732.49356: _low_level_execute_command(): starting 30583 1726853732.49364: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853732.4934464-33782-139624842199859 `" && echo ansible-tmp-1726853732.4934464-33782-139624842199859="` echo /root/.ansible/tmp/ansible-tmp-1726853732.4934464-33782-139624842199859 `" ) && sleep 0' 30583 1726853732.49802: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853732.49805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853732.49807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853732.49819: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853732.49822: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853732.49875: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853732.49878: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853732.49880: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853732.49947: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853732.51924: stdout chunk (state=3): >>>ansible-tmp-1726853732.4934464-33782-139624842199859=/root/.ansible/tmp/ansible-tmp-1726853732.4934464-33782-139624842199859 <<< 30583 1726853732.52030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853732.52056: stderr chunk (state=3): >>><<< 30583 1726853732.52062: stdout chunk (state=3): >>><<< 30583 1726853732.52076: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853732.4934464-33782-139624842199859=/root/.ansible/tmp/ansible-tmp-1726853732.4934464-33782-139624842199859 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853732.52118: variable 'ansible_module_compression' from source: unknown 30583 1726853732.52154: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30583 1726853732.52209: variable 'ansible_facts' from source: unknown 30583 1726853732.52328: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853732.4934464-33782-139624842199859/AnsiballZ_package_facts.py 30583 1726853732.52426: Sending initial data 30583 1726853732.52430: Sent initial data (162 bytes) 30583 1726853732.52880: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853732.52883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853732.52885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853732.52889: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853732.52891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853732.52939: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853732.52944: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853732.52947: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853732.53013: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853732.54661: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30583 1726853732.54665: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853732.54728: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853732.54795: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpt6b13abp /root/.ansible/tmp/ansible-tmp-1726853732.4934464-33782-139624842199859/AnsiballZ_package_facts.py <<< 30583 1726853732.54802: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853732.4934464-33782-139624842199859/AnsiballZ_package_facts.py" <<< 30583 1726853732.54864: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpt6b13abp" to remote "/root/.ansible/tmp/ansible-tmp-1726853732.4934464-33782-139624842199859/AnsiballZ_package_facts.py" <<< 30583 1726853732.54868: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853732.4934464-33782-139624842199859/AnsiballZ_package_facts.py" <<< 30583 1726853732.56356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853732.56396: stderr chunk (state=3): >>><<< 30583 1726853732.56399: stdout chunk (state=3): >>><<< 30583 1726853732.56439: done transferring module to remote 30583 1726853732.56448: _low_level_execute_command(): starting 30583 1726853732.56452: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853732.4934464-33782-139624842199859/ /root/.ansible/tmp/ansible-tmp-1726853732.4934464-33782-139624842199859/AnsiballZ_package_facts.py && sleep 0' 30583 1726853732.56861: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853732.56883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853732.56890: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853732.56901: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853732.56946: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853732.56949: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853732.57024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853732.58955: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853732.58961: stdout chunk (state=3): >>><<< 30583 1726853732.58964: stderr chunk (state=3): >>><<< 30583 1726853732.58984: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853732.59067: _low_level_execute_command(): starting 30583 1726853732.59070: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853732.4934464-33782-139624842199859/AnsiballZ_package_facts.py && sleep 0' 30583 1726853732.59690: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853732.59713: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853732.59727: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853732.59744: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30583 1726853732.59853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853733.04587: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": 
"2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", 
"version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": 
[{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": 
"1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", 
"version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 30583 1726853733.04821: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": 
[{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", 
"release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": 
"memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": 
"8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": 
"12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": 
"python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": 
"python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", 
"release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": 
"12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", 
"version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", 
"release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": 
"2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", 
"epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": 
"python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "<<< 30583 1726853733.04850: stdout chunk (state=3): >>>source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, 
"arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30583 1726853733.06483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853733.06645: stderr chunk (state=3): >>>Shared connection to 10.31.9.197 closed. 
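The module result that closes above is the standard `package_facts` payload: `ansible_facts.packages` maps each package name to a list of install records with `name`, `version`, `release`, `epoch`, `arch`, and `source` fields. A minimal sketch of reading such a capture offline, assuming the JSON has been saved from the log; `package_version` is an illustrative helper, not part of Ansible's API, and the embedded sample mirrors two entries from the dump above:

```python
import json

# Sample mirroring the ansible_facts.packages structure seen in the log.
# Each package maps to a LIST of install records, since multiple versions
# of one package (e.g. kernels) can be installed side by side.
sample = json.loads('''
{"ansible_facts": {"packages": {
  "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10",
               "epoch": null, "arch": "x86_64", "source": "rpm"}],
  "kernel": [{"name": "kernel", "version": "6.11.0",
              "release": "0.rc6.23.el10", "epoch": null,
              "arch": "x86_64", "source": "rpm"}]
}}}
''')

def package_version(facts, name):
    """Return 'version-release' for the first install record, or None."""
    records = facts["ansible_facts"]["packages"].get(name, [])
    if not records:
        return None
    rec = records[0]
    return f'{rec["version"]}-{rec["release"]}'

print(package_version(sample, "openssh"))  # 9.8p1-3.el10
print(package_version(sample, "kernel"))   # 6.11.0-0.rc6.23.el10
```

In a playbook the same lookup is usually done in Jinja2, e.g. `ansible_facts.packages['openssh'][0].version`, after a `package_facts:` task has run.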
<<< 30583 1726853733.06649: stdout chunk (state=3): >>><<< 30583 1726853733.06651: stderr chunk (state=3): >>><<< 30583 1726853733.06689: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853733.10374: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853732.4934464-33782-139624842199859/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853733.10455: _low_level_execute_command(): starting 30583 1726853733.10461: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853732.4934464-33782-139624842199859/ > /dev/null 2>&1 && sleep 0' 30583 1726853733.11087: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853733.11100: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853733.11122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853733.11233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853733.11474: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853733.11538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853733.13508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853733.13512: stdout chunk (state=3): >>><<< 30583 1726853733.13518: stderr chunk (state=3): >>><<< 30583 1726853733.13549: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853733.13555: handler run complete 30583 1726853733.14677: variable 
'ansible_facts' from source: unknown 30583 1726853733.15202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853733.17248: variable 'ansible_facts' from source: unknown 30583 1726853733.17797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853733.18606: attempt loop complete, returning result 30583 1726853733.18609: _execute() done 30583 1726853733.18611: dumping result to json 30583 1726853733.19099: done dumping result, returning 30583 1726853733.19109: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-05ea-abc5-0000000014bc] 30583 1726853733.19112: sending task result for task 02083763-bbaf-05ea-abc5-0000000014bc 30583 1726853733.22519: done sending task result for task 02083763-bbaf-05ea-abc5-0000000014bc 30583 1726853733.22524: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853733.22689: no more pending results, returning what we have 30583 1726853733.22693: results queue empty 30583 1726853733.22694: checking for any_errors_fatal 30583 1726853733.22700: done checking for any_errors_fatal 30583 1726853733.22700: checking for max_fail_percentage 30583 1726853733.22702: done checking for max_fail_percentage 30583 1726853733.22703: checking to see if all hosts have failed and the running result is not ok 30583 1726853733.22704: done checking to see if all hosts have failed 30583 1726853733.22704: getting the remaining hosts for this loop 30583 1726853733.22706: done getting the remaining hosts for this loop 30583 1726853733.22709: getting the next task for host managed_node2 30583 1726853733.22716: done getting next task for host managed_node2 30583 1726853733.22720: ^ task is: TASK: 
fedora.linux_system_roles.network : Print network provider 30583 1726853733.22725: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853733.22741: getting variables 30583 1726853733.22742: in VariableManager get_vars() 30583 1726853733.22775: Calling all_inventory to load vars for managed_node2 30583 1726853733.22778: Calling groups_inventory to load vars for managed_node2 30583 1726853733.22780: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853733.22789: Calling all_plugins_play to load vars for managed_node2 30583 1726853733.22792: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853733.22795: Calling groups_plugins_play to load vars for managed_node2 30583 1726853733.24306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853733.26114: done with get_vars() 30583 1726853733.26143: done getting variables 30583 1726853733.26216: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:35:33 -0400 (0:00:00.809) 0:01:08.599 ****** 30583 1726853733.26263: entering _queue_task() for managed_node2/debug 30583 1726853733.26652: worker is 1 (out of 1 available) 30583 1726853733.26668: exiting _queue_task() for managed_node2/debug 30583 1726853733.26684: done queuing things up, now waiting for results queue to drain 30583 1726853733.26686: waiting for pending results... 
30583 1726853733.27101: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853733.27247: in run() - task 02083763-bbaf-05ea-abc5-000000001460 30583 1726853733.27267: variable 'ansible_search_path' from source: unknown 30583 1726853733.27338: variable 'ansible_search_path' from source: unknown 30583 1726853733.27341: calling self._execute() 30583 1726853733.27416: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853733.27428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853733.27458: variable 'omit' from source: magic vars 30583 1726853733.27838: variable 'ansible_distribution_major_version' from source: facts 30583 1726853733.27854: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853733.27865: variable 'omit' from source: magic vars 30583 1726853733.27929: variable 'omit' from source: magic vars 30583 1726853733.28028: variable 'network_provider' from source: set_fact 30583 1726853733.28051: variable 'omit' from source: magic vars 30583 1726853733.28176: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853733.28180: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853733.28182: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853733.28184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853733.28191: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853733.28229: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853733.28241: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 
1726853733.28250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853733.28353: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853733.28365: Set connection var ansible_timeout to 10 30583 1726853733.28374: Set connection var ansible_connection to ssh 30583 1726853733.28384: Set connection var ansible_shell_executable to /bin/sh 30583 1726853733.28390: Set connection var ansible_shell_type to sh 30583 1726853733.28402: Set connection var ansible_pipelining to False 30583 1726853733.28434: variable 'ansible_shell_executable' from source: unknown 30583 1726853733.28445: variable 'ansible_connection' from source: unknown 30583 1726853733.28453: variable 'ansible_module_compression' from source: unknown 30583 1726853733.28460: variable 'ansible_shell_type' from source: unknown 30583 1726853733.28532: variable 'ansible_shell_executable' from source: unknown 30583 1726853733.28537: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853733.28539: variable 'ansible_pipelining' from source: unknown 30583 1726853733.28541: variable 'ansible_timeout' from source: unknown 30583 1726853733.28543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853733.28637: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853733.28658: variable 'omit' from source: magic vars 30583 1726853733.28668: starting attempt loop 30583 1726853733.28676: running the handler 30583 1726853733.28724: handler run complete 30583 1726853733.28746: attempt loop complete, returning result 30583 1726853733.28759: _execute() done 30583 1726853733.28766: dumping result to json 30583 1726853733.28856: done dumping result, returning 
30583 1726853733.28864: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-05ea-abc5-000000001460] 30583 1726853733.28867: sending task result for task 02083763-bbaf-05ea-abc5-000000001460 30583 1726853733.28935: done sending task result for task 02083763-bbaf-05ea-abc5-000000001460 30583 1726853733.28938: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 30583 1726853733.29038: no more pending results, returning what we have 30583 1726853733.29041: results queue empty 30583 1726853733.29043: checking for any_errors_fatal 30583 1726853733.29050: done checking for any_errors_fatal 30583 1726853733.29051: checking for max_fail_percentage 30583 1726853733.29053: done checking for max_fail_percentage 30583 1726853733.29054: checking to see if all hosts have failed and the running result is not ok 30583 1726853733.29055: done checking to see if all hosts have failed 30583 1726853733.29056: getting the remaining hosts for this loop 30583 1726853733.29058: done getting the remaining hosts for this loop 30583 1726853733.29061: getting the next task for host managed_node2 30583 1726853733.29074: done getting next task for host managed_node2 30583 1726853733.29077: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853733.29084: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853733.29095: getting variables 30583 1726853733.29097: in VariableManager get_vars() 30583 1726853733.29135: Calling all_inventory to load vars for managed_node2 30583 1726853733.29138: Calling groups_inventory to load vars for managed_node2 30583 1726853733.29140: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853733.29150: Calling all_plugins_play to load vars for managed_node2 30583 1726853733.29154: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853733.29156: Calling groups_plugins_play to load vars for managed_node2 30583 1726853733.30709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853733.32492: done with get_vars() 30583 1726853733.32520: done getting variables 30583 1726853733.32582: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:35:33 -0400 (0:00:00.063) 0:01:08.663 ****** 30583 1726853733.32621: entering _queue_task() for managed_node2/fail 30583 1726853733.33176: worker is 1 (out of 1 available) 30583 1726853733.33186: exiting _queue_task() for managed_node2/fail 30583 1726853733.33197: done queuing things up, now waiting for results queue to drain 30583 1726853733.33198: waiting for pending results... 30583 1726853733.33332: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853733.33536: in run() - task 02083763-bbaf-05ea-abc5-000000001461 30583 1726853733.33540: variable 'ansible_search_path' from source: unknown 30583 1726853733.33544: variable 'ansible_search_path' from source: unknown 30583 1726853733.33546: calling self._execute() 30583 1726853733.33646: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853733.33659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853733.33675: variable 'omit' from source: magic vars 30583 1726853733.34055: variable 'ansible_distribution_major_version' from source: facts 30583 1726853733.34077: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853733.34209: variable 'network_state' from source: role '' defaults 30583 1726853733.34224: Evaluated conditional (network_state != {}): False 30583 1726853733.34231: when evaluation is False, skipping this task 30583 1726853733.34275: _execute() done 30583 1726853733.34278: dumping result to json 30583 1726853733.34281: done dumping result, returning 30583 1726853733.34284: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-05ea-abc5-000000001461] 30583 1726853733.34287: sending task result for task 02083763-bbaf-05ea-abc5-000000001461 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853733.34450: no more pending results, returning what we have 30583 1726853733.34454: results queue empty 30583 1726853733.34455: checking for any_errors_fatal 30583 1726853733.34462: done checking for any_errors_fatal 30583 1726853733.34462: checking for max_fail_percentage 30583 1726853733.34464: done checking for max_fail_percentage 30583 1726853733.34465: checking to see if all hosts have failed and the running result is not ok 30583 1726853733.34466: done checking to see if all hosts have failed 30583 1726853733.34467: getting the remaining hosts for this loop 30583 1726853733.34469: done getting the remaining hosts for this loop 30583 1726853733.34475: getting the next task for host managed_node2 30583 1726853733.34484: done getting next task for host managed_node2 30583 1726853733.34488: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30583 1726853733.34494: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853733.34635: getting variables 30583 1726853733.34638: in VariableManager get_vars() 30583 1726853733.34682: Calling all_inventory to load vars for managed_node2 30583 1726853733.34686: Calling groups_inventory to load vars for managed_node2 30583 1726853733.34688: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853733.34729: done sending task result for task 02083763-bbaf-05ea-abc5-000000001461 30583 1726853733.34733: WORKER PROCESS EXITING 30583 1726853733.34742: Calling all_plugins_play to load vars for managed_node2 30583 1726853733.34746: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853733.34749: Calling groups_plugins_play to load vars for managed_node2 30583 1726853733.36347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853733.38261: done with get_vars() 30583 1726853733.38294: done getting variables 30583 1726853733.38362: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:35:33 -0400 (0:00:00.057) 0:01:08.721 ****** 30583 1726853733.38397: entering _queue_task() for managed_node2/fail 30583 1726853733.38747: worker is 1 (out of 1 available) 30583 1726853733.38761: exiting _queue_task() for managed_node2/fail 30583 1726853733.38775: done queuing things up, now waiting for results queue to drain 30583 1726853733.38777: waiting for pending results... 30583 1726853733.39050: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30583 1726853733.39193: in run() - task 02083763-bbaf-05ea-abc5-000000001462 30583 1726853733.39212: variable 'ansible_search_path' from source: unknown 30583 1726853733.39220: variable 'ansible_search_path' from source: unknown 30583 1726853733.39262: calling self._execute() 30583 1726853733.39362: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853733.39376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853733.39390: variable 'omit' from source: magic vars 30583 1726853733.39750: variable 'ansible_distribution_major_version' from source: facts 30583 1726853733.39769: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853733.39893: variable 'network_state' from source: role '' defaults 30583 1726853733.39907: Evaluated conditional (network_state != {}): False 30583 1726853733.39915: when evaluation is False, skipping this task 30583 1726853733.39976: _execute() done 30583 1726853733.39979: dumping result to json 30583 1726853733.39981: done dumping result, returning 30583 1726853733.39985: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the 
system version of the managed host is below 8 [02083763-bbaf-05ea-abc5-000000001462] 30583 1726853733.39988: sending task result for task 02083763-bbaf-05ea-abc5-000000001462 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853733.40106: no more pending results, returning what we have 30583 1726853733.40110: results queue empty 30583 1726853733.40111: checking for any_errors_fatal 30583 1726853733.40120: done checking for any_errors_fatal 30583 1726853733.40121: checking for max_fail_percentage 30583 1726853733.40123: done checking for max_fail_percentage 30583 1726853733.40124: checking to see if all hosts have failed and the running result is not ok 30583 1726853733.40124: done checking to see if all hosts have failed 30583 1726853733.40125: getting the remaining hosts for this loop 30583 1726853733.40127: done getting the remaining hosts for this loop 30583 1726853733.40131: getting the next task for host managed_node2 30583 1726853733.40138: done getting next task for host managed_node2 30583 1726853733.40142: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30583 1726853733.40148: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853733.40176: getting variables 30583 1726853733.40178: in VariableManager get_vars() 30583 1726853733.40214: Calling all_inventory to load vars for managed_node2 30583 1726853733.40217: Calling groups_inventory to load vars for managed_node2 30583 1726853733.40219: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853733.40229: Calling all_plugins_play to load vars for managed_node2 30583 1726853733.40232: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853733.40235: Calling groups_plugins_play to load vars for managed_node2 30583 1726853733.40784: done sending task result for task 02083763-bbaf-05ea-abc5-000000001462 30583 1726853733.40787: WORKER PROCESS EXITING 30583 1726853733.41604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853733.43152: done with get_vars() 30583 1726853733.43183: done getting variables 30583 1726853733.43243: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 
or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:35:33 -0400 (0:00:00.048) 0:01:08.770 ****** 30583 1726853733.43283: entering _queue_task() for managed_node2/fail 30583 1726853733.43624: worker is 1 (out of 1 available) 30583 1726853733.43637: exiting _queue_task() for managed_node2/fail 30583 1726853733.43648: done queuing things up, now waiting for results queue to drain 30583 1726853733.43649: waiting for pending results... 30583 1726853733.43952: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30583 1726853733.44109: in run() - task 02083763-bbaf-05ea-abc5-000000001463 30583 1726853733.44129: variable 'ansible_search_path' from source: unknown 30583 1726853733.44138: variable 'ansible_search_path' from source: unknown 30583 1726853733.44184: calling self._execute() 30583 1726853733.44286: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853733.44298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853733.44317: variable 'omit' from source: magic vars 30583 1726853733.44704: variable 'ansible_distribution_major_version' from source: facts 30583 1726853733.44721: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853733.44900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853733.47519: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853733.47554: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853733.47610: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 
1726853733.47653: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853733.47688: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853733.47840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853733.47844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853733.47847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853733.47885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853733.47903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853733.48009: variable 'ansible_distribution_major_version' from source: facts 30583 1726853733.48029: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30583 1726853733.48155: variable 'ansible_distribution' from source: facts 30583 1726853733.48167: variable '__network_rh_distros' from source: role '' defaults 30583 1726853733.48187: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30583 1726853733.48445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853733.48502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853733.48511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853733.48553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853733.48580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853733.48777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853733.48780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853733.48782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853733.48784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853733.48785: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853733.48787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853733.48789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853733.48803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853733.48843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853733.48867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853733.49203: variable 'network_connections' from source: include params 30583 1726853733.49221: variable 'interface' from source: play vars 30583 1726853733.49294: variable 'interface' from source: play vars 30583 1726853733.49309: variable 'network_state' from source: role '' defaults 30583 1726853733.49394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853733.49584: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853733.49629: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 
1726853733.49673: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853733.49708: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853733.49756: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853733.49793: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853733.49831: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853733.49877: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853733.49978: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30583 1726853733.49981: when evaluation is False, skipping this task 30583 1726853733.49983: _execute() done 30583 1726853733.49986: dumping result to json 30583 1726853733.49988: done dumping result, returning 30583 1726853733.49991: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-05ea-abc5-000000001463] 30583 1726853733.49993: sending task result for task 02083763-bbaf-05ea-abc5-000000001463 30583 1726853733.50065: done sending task 
result for task 02083763-bbaf-05ea-abc5-000000001463 30583 1726853733.50069: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30583 1726853733.50124: no more pending results, returning what we have 30583 1726853733.50128: results queue empty 30583 1726853733.50129: checking for any_errors_fatal 30583 1726853733.50136: done checking for any_errors_fatal 30583 1726853733.50137: checking for max_fail_percentage 30583 1726853733.50140: done checking for max_fail_percentage 30583 1726853733.50141: checking to see if all hosts have failed and the running result is not ok 30583 1726853733.50141: done checking to see if all hosts have failed 30583 1726853733.50142: getting the remaining hosts for this loop 30583 1726853733.50144: done getting the remaining hosts for this loop 30583 1726853733.50149: getting the next task for host managed_node2 30583 1726853733.50157: done getting next task for host managed_node2 30583 1726853733.50164: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30583 1726853733.50169: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853733.50197: getting variables 30583 1726853733.50199: in VariableManager get_vars() 30583 1726853733.50244: Calling all_inventory to load vars for managed_node2 30583 1726853733.50247: Calling groups_inventory to load vars for managed_node2 30583 1726853733.50250: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853733.50264: Calling all_plugins_play to load vars for managed_node2 30583 1726853733.50267: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853733.50475: Calling groups_plugins_play to load vars for managed_node2 30583 1726853733.52055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853733.53668: done with get_vars() 30583 1726853733.53694: done getting variables 30583 1726853733.53752: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:35:33 -0400 (0:00:00.105) 0:01:08.875 ****** 30583 1726853733.53791: entering _queue_task() for managed_node2/dnf 30583 1726853733.54139: worker is 1 (out of 1 available) 30583 1726853733.54153: exiting _queue_task() for managed_node2/dnf 30583 1726853733.54170: done queuing things up, now waiting for results queue to drain 30583 1726853733.54375: waiting for pending results... 30583 1726853733.54700: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30583 1726853733.54705: in run() - task 02083763-bbaf-05ea-abc5-000000001464 30583 1726853733.54708: variable 'ansible_search_path' from source: unknown 30583 1726853733.54712: variable 'ansible_search_path' from source: unknown 30583 1726853733.54733: calling self._execute() 30583 1726853733.54848: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853733.54865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853733.54884: variable 'omit' from source: magic vars 30583 1726853733.55285: variable 'ansible_distribution_major_version' from source: facts 30583 1726853733.55302: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853733.55503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853733.57741: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853733.57832: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853733.57890: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853733.57934: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853733.57979: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853733.58176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853733.58180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853733.58182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853733.58184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853733.58203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853733.58345: variable 'ansible_distribution' from source: facts 30583 1726853733.58357: variable 'ansible_distribution_major_version' from source: facts 30583 1726853733.58391: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30583 1726853733.58524: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853733.58668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853733.58699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853733.58731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853733.58778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853733.58796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853733.58846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853733.58877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853733.58905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853733.58950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853733.58973: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853733.59015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853733.59146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853733.59150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853733.59153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853733.59155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853733.59289: variable 'network_connections' from source: include params 30583 1726853733.59304: variable 'interface' from source: play vars 30583 1726853733.59366: variable 'interface' from source: play vars 30583 1726853733.59438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853733.59639: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853733.59689: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853733.59731: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853733.59768: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853733.59841: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853733.59920: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853733.59932: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853733.59947: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853733.60002: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853733.60273: variable 'network_connections' from source: include params 30583 1726853733.60285: variable 'interface' from source: play vars 30583 1726853733.60574: variable 'interface' from source: play vars 30583 1726853733.60578: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853733.60581: when evaluation is False, skipping this task 30583 1726853733.60583: _execute() done 30583 1726853733.60585: dumping result to json 30583 1726853733.60587: done dumping result, returning 30583 1726853733.60590: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000001464] 30583 
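
The skip decisions recorded around here hinge on Jinja2 expressions such as the team-interface check, `network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0`. As a rough plain-Python restatement of what that expression computes (the sample data below is hypothetical; the real role evaluates this through Ansible's `selectattr` filter and its `match` test, not this helper):

```python
import re

def has_team_interface(network_connections, network_state):
    # Plain-Python sketch of the role's skip condition seen in the log:
    # keep only entries whose "type" attribute is defined AND matches ^team$,
    # checking both network_connections and network_state["interfaces"].
    def team_entries(entries):
        return [e for e in entries
                if "type" in e and re.match(r"^team$", str(e["type"]))]
    return bool(team_entries(network_connections)
                or team_entries(network_state.get("interfaces", [])))

# No team-type connection in this sample, so the task would be skipped,
# matching the "Conditional result was False" outcome in the log.
print(has_team_interface([{"name": "statebr", "type": "ethernet"}], {}))  # False
print(has_team_interface([{"name": "team0", "type": "team"}], {}))        # True
```

Because no connection in the play has `type: team`, the conditional evaluates False and the task is skipped, which is exactly what the `skipping: [managed_node2]` result above shows.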
1726853733.60592: sending task result for task 02083763-bbaf-05ea-abc5-000000001464 30583 1726853733.60674: done sending task result for task 02083763-bbaf-05ea-abc5-000000001464 30583 1726853733.60677: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853733.60734: no more pending results, returning what we have 30583 1726853733.60738: results queue empty 30583 1726853733.60739: checking for any_errors_fatal 30583 1726853733.60746: done checking for any_errors_fatal 30583 1726853733.60747: checking for max_fail_percentage 30583 1726853733.60749: done checking for max_fail_percentage 30583 1726853733.60750: checking to see if all hosts have failed and the running result is not ok 30583 1726853733.60751: done checking to see if all hosts have failed 30583 1726853733.60752: getting the remaining hosts for this loop 30583 1726853733.60754: done getting the remaining hosts for this loop 30583 1726853733.60762: getting the next task for host managed_node2 30583 1726853733.60772: done getting next task for host managed_node2 30583 1726853733.60777: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30583 1726853733.60782: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853733.60807: getting variables 30583 1726853733.60810: in VariableManager get_vars() 30583 1726853733.60855: Calling all_inventory to load vars for managed_node2 30583 1726853733.60860: Calling groups_inventory to load vars for managed_node2 30583 1726853733.60863: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853733.61075: Calling all_plugins_play to load vars for managed_node2 30583 1726853733.61080: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853733.61083: Calling groups_plugins_play to load vars for managed_node2 30583 1726853733.62404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853733.64147: done with get_vars() 30583 1726853733.64173: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30583 1726853733.64248: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:35:33 -0400 (0:00:00.104) 0:01:08.980 ****** 30583 1726853733.64285: entering _queue_task() for managed_node2/yum 30583 1726853733.64628: worker is 1 (out of 1 available) 30583 1726853733.64641: exiting _queue_task() for managed_node2/yum 30583 1726853733.64653: done queuing things up, now waiting for results queue to drain 30583 1726853733.64655: waiting for pending results... 30583 1726853733.64963: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30583 1726853733.65124: in run() - task 02083763-bbaf-05ea-abc5-000000001465 30583 1726853733.65144: variable 'ansible_search_path' from source: unknown 30583 1726853733.65152: variable 'ansible_search_path' from source: unknown 30583 1726853733.65199: calling self._execute() 30583 1726853733.65301: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853733.65476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853733.65480: variable 'omit' from source: magic vars 30583 1726853733.65723: variable 'ansible_distribution_major_version' from source: facts 30583 1726853733.65740: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853733.65927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853733.68220: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853733.68309: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853733.68349: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853733.68393: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853733.68428: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853733.68501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853733.68535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853733.68564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853733.68606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853733.68621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853733.68725: variable 'ansible_distribution_major_version' from source: facts 30583 1726853733.68753: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30583 1726853733.68765: when evaluation is False, skipping this task 30583 1726853733.68775: _execute() done 30583 1726853733.68784: dumping result to json 30583 1726853733.68791: done dumping result, returning 30583 1726853733.68847: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000001465] 30583 1726853733.68851: sending task result for task 02083763-bbaf-05ea-abc5-000000001465 30583 1726853733.68935: done sending task result for task 02083763-bbaf-05ea-abc5-000000001465 30583 1726853733.68938: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30583 1726853733.69011: no more pending results, returning what we have 30583 1726853733.69015: results queue empty 30583 1726853733.69016: checking for any_errors_fatal 30583 1726853733.69025: done checking for any_errors_fatal 30583 1726853733.69026: checking for max_fail_percentage 30583 1726853733.69029: done checking for max_fail_percentage 30583 1726853733.69030: checking to see if all hosts have failed and the running result is not ok 30583 1726853733.69031: done checking to see if all hosts have failed 30583 1726853733.69032: getting the remaining hosts for this loop 30583 1726853733.69035: done getting the remaining hosts for this loop 30583 1726853733.69040: getting the next task for host managed_node2 30583 1726853733.69049: done getting next task for host managed_node2 30583 1726853733.69053: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30583 1726853733.69062: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853733.69192: getting variables 30583 1726853733.69194: in VariableManager get_vars() 30583 1726853733.69239: Calling all_inventory to load vars for managed_node2 30583 1726853733.69243: Calling groups_inventory to load vars for managed_node2 30583 1726853733.69246: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853733.69261: Calling all_plugins_play to load vars for managed_node2 30583 1726853733.69265: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853733.69268: Calling groups_plugins_play to load vars for managed_node2 30583 1726853733.70857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853733.72410: done with get_vars() 30583 1726853733.72434: done getting variables 30583 1726853733.72498: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:35:33 -0400 (0:00:00.082) 0:01:09.062 ****** 30583 1726853733.72533: entering _queue_task() for managed_node2/fail 30583 1726853733.72875: worker is 1 (out of 1 available) 30583 1726853733.72889: exiting _queue_task() for managed_node2/fail 30583 1726853733.72901: done queuing things up, now waiting for results queue to drain 30583 1726853733.72902: waiting for pending results... 30583 1726853733.73293: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30583 1726853733.73361: in run() - task 02083763-bbaf-05ea-abc5-000000001466 30583 1726853733.73410: variable 'ansible_search_path' from source: unknown 30583 1726853733.73414: variable 'ansible_search_path' from source: unknown 30583 1726853733.73430: calling self._execute() 30583 1726853733.73536: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853733.73547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853733.73627: variable 'omit' from source: magic vars 30583 1726853733.73941: variable 'ansible_distribution_major_version' from source: facts 30583 1726853733.73965: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853733.74095: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853733.74299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853733.76904: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853733.76908: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853733.76911: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853733.76927: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853733.76955: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853733.77064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853733.77126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853733.77153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853733.77204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853733.77219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853733.77445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853733.77449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853733.77451: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853733.77453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853733.77455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853733.77460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853733.77462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853733.77472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853733.77516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853733.77535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853733.77748: variable 'network_connections' from source: include params 30583 1726853733.77763: variable 'interface' from source: play vars 30583 1726853733.77878: variable 'interface' from source: play vars 30583 1726853733.77909: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853733.78105: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853733.78142: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853733.78181: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853733.78209: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853733.78253: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853733.78676: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853733.78690: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853733.78724: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853733.78797: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853733.79247: variable 'network_connections' from source: include params 30583 1726853733.79251: variable 'interface' from source: play vars 30583 1726853733.79253: variable 'interface' from source: play vars 30583 1726853733.79276: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853733.79280: when evaluation is False, skipping this task 30583 
1726853733.79282: _execute() done 30583 1726853733.79285: dumping result to json 30583 1726853733.79287: done dumping result, returning 30583 1726853733.79294: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000001466] 30583 1726853733.79296: sending task result for task 02083763-bbaf-05ea-abc5-000000001466 30583 1726853733.79487: done sending task result for task 02083763-bbaf-05ea-abc5-000000001466 30583 1726853733.79490: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853733.79554: no more pending results, returning what we have 30583 1726853733.79557: results queue empty 30583 1726853733.79561: checking for any_errors_fatal 30583 1726853733.79566: done checking for any_errors_fatal 30583 1726853733.79567: checking for max_fail_percentage 30583 1726853733.79568: done checking for max_fail_percentage 30583 1726853733.79569: checking to see if all hosts have failed and the running result is not ok 30583 1726853733.79570: done checking to see if all hosts have failed 30583 1726853733.79573: getting the remaining hosts for this loop 30583 1726853733.79575: done getting the remaining hosts for this loop 30583 1726853733.79578: getting the next task for host managed_node2 30583 1726853733.79586: done getting next task for host managed_node2 30583 1726853733.79589: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30583 1726853733.79594: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853733.79612: getting variables 30583 1726853733.79614: in VariableManager get_vars() 30583 1726853733.79649: Calling all_inventory to load vars for managed_node2 30583 1726853733.79651: Calling groups_inventory to load vars for managed_node2 30583 1726853733.79653: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853733.79663: Calling all_plugins_play to load vars for managed_node2 30583 1726853733.79666: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853733.79668: Calling groups_plugins_play to load vars for managed_node2 30583 1726853733.86179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853733.87116: done with get_vars() 30583 1726853733.87137: done getting variables 30583 1726853733.87174: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:35:33 -0400 (0:00:00.146) 0:01:09.209 ****** 30583 1726853733.87199: entering _queue_task() for managed_node2/package 30583 1726853733.87474: worker is 1 (out of 1 available) 30583 1726853733.87490: exiting _queue_task() for managed_node2/package 30583 1726853733.87503: done queuing things up, now waiting for results queue to drain 30583 1726853733.87504: waiting for pending results... 30583 1726853733.87697: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 30583 1726853733.87807: in run() - task 02083763-bbaf-05ea-abc5-000000001467 30583 1726853733.87817: variable 'ansible_search_path' from source: unknown 30583 1726853733.87821: variable 'ansible_search_path' from source: unknown 30583 1726853733.87858: calling self._execute() 30583 1726853733.87990: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853733.87994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853733.87997: variable 'omit' from source: magic vars 30583 1726853733.88477: variable 'ansible_distribution_major_version' from source: facts 30583 1726853733.88481: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853733.88639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853733.89023: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853733.89027: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853733.89030: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853733.89049: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853733.89282: variable 'network_packages' from source: role '' defaults 30583 1726853733.89286: variable '__network_provider_setup' from source: role '' defaults 30583 1726853733.89289: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853733.89338: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853733.89347: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853733.89409: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853733.89590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853733.91026: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853733.91074: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853733.91101: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853733.91127: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853733.91153: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853733.91212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853733.91235: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853733.91253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853733.91283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853733.91294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853733.91324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853733.91343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853733.91360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853733.91388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853733.91398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 
1726853733.91544: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853733.91642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853733.91713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853733.91716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853733.91719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853733.91743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853733.91853: variable 'ansible_python' from source: facts 30583 1726853733.91857: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853733.91948: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853733.92043: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853733.92190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853733.92276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853733.92279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853733.92285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853733.92315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853733.92364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853733.92412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853733.92488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853733.92492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853733.92522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853733.92630: variable 'network_connections' from source: include params 
30583 1726853733.92637: variable 'interface' from source: play vars 30583 1726853733.92708: variable 'interface' from source: play vars 30583 1726853733.92759: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853733.92783: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853733.92803: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853733.92824: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853733.92865: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853733.93043: variable 'network_connections' from source: include params 30583 1726853733.93046: variable 'interface' from source: play vars 30583 1726853733.93125: variable 'interface' from source: play vars 30583 1726853733.93147: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853733.93205: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853733.93400: variable 'network_connections' from source: include params 30583 1726853733.93404: variable 'interface' from source: play vars 30583 1726853733.93449: variable 'interface' from source: play vars 30583 1726853733.93467: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853733.93523: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853733.93716: variable 'network_connections' 
from source: include params 30583 1726853733.93719: variable 'interface' from source: play vars 30583 1726853733.93765: variable 'interface' from source: play vars 30583 1726853733.93802: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853733.93846: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853733.93851: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853733.93897: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853733.94032: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853733.94330: variable 'network_connections' from source: include params 30583 1726853733.94333: variable 'interface' from source: play vars 30583 1726853733.94379: variable 'interface' from source: play vars 30583 1726853733.94385: variable 'ansible_distribution' from source: facts 30583 1726853733.94389: variable '__network_rh_distros' from source: role '' defaults 30583 1726853733.94395: variable 'ansible_distribution_major_version' from source: facts 30583 1726853733.94405: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853733.94511: variable 'ansible_distribution' from source: facts 30583 1726853733.94515: variable '__network_rh_distros' from source: role '' defaults 30583 1726853733.94518: variable 'ansible_distribution_major_version' from source: facts 30583 1726853733.94530: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853733.94638: variable 'ansible_distribution' from source: facts 30583 1726853733.94642: variable '__network_rh_distros' from source: role '' defaults 30583 1726853733.94646: variable 'ansible_distribution_major_version' from source: facts 30583 1726853733.94676: variable 'network_provider' from source: set_fact 30583 
1726853733.94689: variable 'ansible_facts' from source: unknown 30583 1726853733.95125: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30583 1726853733.95129: when evaluation is False, skipping this task 30583 1726853733.95131: _execute() done 30583 1726853733.95134: dumping result to json 30583 1726853733.95136: done dumping result, returning 30583 1726853733.95145: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-05ea-abc5-000000001467] 30583 1726853733.95148: sending task result for task 02083763-bbaf-05ea-abc5-000000001467 30583 1726853733.95232: done sending task result for task 02083763-bbaf-05ea-abc5-000000001467 30583 1726853733.95234: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30583 1726853733.95297: no more pending results, returning what we have 30583 1726853733.95300: results queue empty 30583 1726853733.95301: checking for any_errors_fatal 30583 1726853733.95308: done checking for any_errors_fatal 30583 1726853733.95309: checking for max_fail_percentage 30583 1726853733.95311: done checking for max_fail_percentage 30583 1726853733.95312: checking to see if all hosts have failed and the running result is not ok 30583 1726853733.95312: done checking to see if all hosts have failed 30583 1726853733.95313: getting the remaining hosts for this loop 30583 1726853733.95315: done getting the remaining hosts for this loop 30583 1726853733.95318: getting the next task for host managed_node2 30583 1726853733.95327: done getting next task for host managed_node2 30583 1726853733.95331: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30583 1726853733.95336: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853733.95360: getting variables 30583 1726853733.95362: in VariableManager get_vars() 30583 1726853733.95407: Calling all_inventory to load vars for managed_node2 30583 1726853733.95410: Calling groups_inventory to load vars for managed_node2 30583 1726853733.95412: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853733.95421: Calling all_plugins_play to load vars for managed_node2 30583 1726853733.95424: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853733.95426: Calling groups_plugins_play to load vars for managed_node2 30583 1726853733.96221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853733.97103: done with get_vars() 30583 1726853733.97118: done getting variables 30583 1726853733.97160: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:35:33 -0400 (0:00:00.099) 0:01:09.309 ****** 30583 1726853733.97187: entering _queue_task() for managed_node2/package 30583 1726853733.97426: worker is 1 (out of 1 available) 30583 1726853733.97442: exiting _queue_task() for managed_node2/package 30583 1726853733.97455: done queuing things up, now waiting for results queue to drain 30583 1726853733.97457: waiting for pending results... 
30583 1726853733.97656: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30583 1726853733.97766: in run() - task 02083763-bbaf-05ea-abc5-000000001468 30583 1726853733.97777: variable 'ansible_search_path' from source: unknown 30583 1726853733.97781: variable 'ansible_search_path' from source: unknown 30583 1726853733.97816: calling self._execute() 30583 1726853733.97890: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853733.97898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853733.97909: variable 'omit' from source: magic vars 30583 1726853733.98193: variable 'ansible_distribution_major_version' from source: facts 30583 1726853733.98203: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853733.98288: variable 'network_state' from source: role '' defaults 30583 1726853733.98296: Evaluated conditional (network_state != {}): False 30583 1726853733.98299: when evaluation is False, skipping this task 30583 1726853733.98302: _execute() done 30583 1726853733.98305: dumping result to json 30583 1726853733.98307: done dumping result, returning 30583 1726853733.98315: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-05ea-abc5-000000001468] 30583 1726853733.98318: sending task result for task 02083763-bbaf-05ea-abc5-000000001468 30583 1726853733.98407: done sending task result for task 02083763-bbaf-05ea-abc5-000000001468 30583 1726853733.98410: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853733.98453: no more pending results, returning what we have 30583 1726853733.98457: results queue empty 30583 1726853733.98458: checking 
for any_errors_fatal 30583 1726853733.98463: done checking for any_errors_fatal 30583 1726853733.98464: checking for max_fail_percentage 30583 1726853733.98466: done checking for max_fail_percentage 30583 1726853733.98467: checking to see if all hosts have failed and the running result is not ok 30583 1726853733.98468: done checking to see if all hosts have failed 30583 1726853733.98468: getting the remaining hosts for this loop 30583 1726853733.98470: done getting the remaining hosts for this loop 30583 1726853733.98476: getting the next task for host managed_node2 30583 1726853733.98483: done getting next task for host managed_node2 30583 1726853733.98488: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30583 1726853733.98493: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853733.98514: getting variables 30583 1726853733.98516: in VariableManager get_vars() 30583 1726853733.98548: Calling all_inventory to load vars for managed_node2 30583 1726853733.98550: Calling groups_inventory to load vars for managed_node2 30583 1726853733.98552: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853733.98561: Calling all_plugins_play to load vars for managed_node2 30583 1726853733.98564: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853733.98566: Calling groups_plugins_play to load vars for managed_node2 30583 1726853733.99430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853734.00291: done with get_vars() 30583 1726853734.00305: done getting variables 30583 1726853734.00345: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:35:34 -0400 (0:00:00.031) 0:01:09.340 ****** 30583 1726853734.00370: entering _queue_task() for managed_node2/package 30583 1726853734.00590: worker is 1 (out of 1 available) 30583 1726853734.00604: exiting _queue_task() for managed_node2/package 30583 1726853734.00618: done queuing things up, now waiting for results queue to drain 30583 1726853734.00619: waiting for pending results... 
30583 1726853734.00806: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30583 1726853734.00907: in run() - task 02083763-bbaf-05ea-abc5-000000001469 30583 1726853734.00919: variable 'ansible_search_path' from source: unknown 30583 1726853734.00923: variable 'ansible_search_path' from source: unknown 30583 1726853734.00954: calling self._execute() 30583 1726853734.01028: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853734.01032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853734.01042: variable 'omit' from source: magic vars 30583 1726853734.01326: variable 'ansible_distribution_major_version' from source: facts 30583 1726853734.01335: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853734.01423: variable 'network_state' from source: role '' defaults 30583 1726853734.01431: Evaluated conditional (network_state != {}): False 30583 1726853734.01434: when evaluation is False, skipping this task 30583 1726853734.01438: _execute() done 30583 1726853734.01440: dumping result to json 30583 1726853734.01442: done dumping result, returning 30583 1726853734.01450: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-05ea-abc5-000000001469] 30583 1726853734.01453: sending task result for task 02083763-bbaf-05ea-abc5-000000001469 30583 1726853734.01545: done sending task result for task 02083763-bbaf-05ea-abc5-000000001469 30583 1726853734.01548: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853734.01592: no more pending results, returning what we have 30583 1726853734.01597: results queue empty 30583 1726853734.01598: checking for 
any_errors_fatal 30583 1726853734.01605: done checking for any_errors_fatal 30583 1726853734.01606: checking for max_fail_percentage 30583 1726853734.01607: done checking for max_fail_percentage 30583 1726853734.01608: checking to see if all hosts have failed and the running result is not ok 30583 1726853734.01609: done checking to see if all hosts have failed 30583 1726853734.01610: getting the remaining hosts for this loop 30583 1726853734.01612: done getting the remaining hosts for this loop 30583 1726853734.01615: getting the next task for host managed_node2 30583 1726853734.01622: done getting next task for host managed_node2 30583 1726853734.01626: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30583 1726853734.01631: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853734.01650: getting variables 30583 1726853734.01651: in VariableManager get_vars() 30583 1726853734.01689: Calling all_inventory to load vars for managed_node2 30583 1726853734.01692: Calling groups_inventory to load vars for managed_node2 30583 1726853734.01694: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853734.01701: Calling all_plugins_play to load vars for managed_node2 30583 1726853734.01704: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853734.01706: Calling groups_plugins_play to load vars for managed_node2 30583 1726853734.02452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853734.03321: done with get_vars() 30583 1726853734.03335: done getting variables 30583 1726853734.03378: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:35:34 -0400 (0:00:00.030) 0:01:09.371 ****** 30583 1726853734.03403: entering _queue_task() for managed_node2/service 30583 1726853734.03623: worker is 1 (out of 1 available) 30583 1726853734.03636: exiting _queue_task() for managed_node2/service 30583 1726853734.03650: done queuing things up, now waiting for results queue to drain 30583 1726853734.03651: waiting for pending results... 
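The "Install python3-libnmstate" task above was skipped because its `when:` condition (`network_state != {}`) evaluated to False against the role default of `{}`: the executor short-circuits and emits the skipped-result dict seen in the log instead of invoking the module. A minimal sketch of that skip path (hypothetical helper names, not Ansible's actual executor API; the real executor templates the expression through Jinja2):

```python
# Minimal sketch of conditional task skipping, modeled on the skip
# result printed in the log. Names are illustrative, not Ansible's API.

def run_task(conditional, evaluate, execute):
    """Return a skipped-task result when the conditional is falsy;
    otherwise run the task body."""
    if not evaluate(conditional):
        return {
            "changed": False,
            "false_condition": conditional,
            "skip_reason": "Conditional result was False",
        }
    return execute()

# Mirroring the log: network_state defaults to {} in the role, so the
# `network_state != {}` condition is False and the task is skipped.
network_state = {}
result = run_task(
    "network_state != {}",
    evaluate=lambda cond: network_state != {},
    execute=lambda: {"changed": True},
)
```

The same shape explains the second skip below ("Restart NetworkManager due to wireless or team interfaces"), whose `false_condition` is the compound `__network_wireless_connections_defined or __network_team_connections_defined`.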
30583 1726853734.03834: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30583 1726853734.03944: in run() - task 02083763-bbaf-05ea-abc5-00000000146a 30583 1726853734.03953: variable 'ansible_search_path' from source: unknown 30583 1726853734.03957: variable 'ansible_search_path' from source: unknown 30583 1726853734.04077: calling self._execute() 30583 1726853734.04081: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853734.04084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853734.04087: variable 'omit' from source: magic vars 30583 1726853734.04342: variable 'ansible_distribution_major_version' from source: facts 30583 1726853734.04352: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853734.04437: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853734.04572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853734.06274: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853734.06326: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853734.06352: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853734.06381: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853734.06403: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853734.06459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30583 1726853734.06483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853734.06503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853734.06529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853734.06539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853734.06575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853734.06590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853734.06612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853734.06635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853734.06646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853734.06675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853734.06692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853734.06708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853734.06736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853734.06746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853734.06865: variable 'network_connections' from source: include params 30583 1726853734.06877: variable 'interface' from source: play vars 30583 1726853734.06923: variable 'interface' from source: play vars 30583 1726853734.06978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853734.07090: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853734.07117: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853734.07148: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853734.07174: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853734.07203: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853734.07218: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853734.07234: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853734.07251: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853734.07295: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853734.07444: variable 'network_connections' from source: include params 30583 1726853734.07447: variable 'interface' from source: play vars 30583 1726853734.07496: variable 'interface' from source: play vars 30583 1726853734.07514: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853734.07517: when evaluation is False, skipping this task 30583 1726853734.07519: _execute() done 30583 1726853734.07522: dumping result to json 30583 1726853734.07524: done dumping result, returning 30583 1726853734.07532: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-00000000146a] 30583 1726853734.07535: sending task result for task 02083763-bbaf-05ea-abc5-00000000146a 30583 1726853734.07625: done sending task result for task 
02083763-bbaf-05ea-abc5-00000000146a 30583 1726853734.07635: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853734.07682: no more pending results, returning what we have 30583 1726853734.07685: results queue empty 30583 1726853734.07686: checking for any_errors_fatal 30583 1726853734.07694: done checking for any_errors_fatal 30583 1726853734.07694: checking for max_fail_percentage 30583 1726853734.07696: done checking for max_fail_percentage 30583 1726853734.07697: checking to see if all hosts have failed and the running result is not ok 30583 1726853734.07698: done checking to see if all hosts have failed 30583 1726853734.07698: getting the remaining hosts for this loop 30583 1726853734.07700: done getting the remaining hosts for this loop 30583 1726853734.07703: getting the next task for host managed_node2 30583 1726853734.07711: done getting next task for host managed_node2 30583 1726853734.07715: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30583 1726853734.07720: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853734.07742: getting variables 30583 1726853734.07743: in VariableManager get_vars() 30583 1726853734.07788: Calling all_inventory to load vars for managed_node2 30583 1726853734.07792: Calling groups_inventory to load vars for managed_node2 30583 1726853734.07794: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853734.07803: Calling all_plugins_play to load vars for managed_node2 30583 1726853734.07805: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853734.07807: Calling groups_plugins_play to load vars for managed_node2 30583 1726853734.08698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853734.09563: done with get_vars() 30583 1726853734.09579: done getting variables 30583 1726853734.09622: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:35:34 -0400 (0:00:00.062) 0:01:09.433 ****** 30583 1726853734.09645: entering _queue_task() for managed_node2/service 30583 1726853734.09874: worker is 1 (out of 1 available) 30583 1726853734.09890: exiting _queue_task() for managed_node2/service 30583 1726853734.09903: done 
queuing things up, now waiting for results queue to drain 30583 1726853734.09905: waiting for pending results... 30583 1726853734.10086: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30583 1726853734.10199: in run() - task 02083763-bbaf-05ea-abc5-00000000146b 30583 1726853734.10209: variable 'ansible_search_path' from source: unknown 30583 1726853734.10213: variable 'ansible_search_path' from source: unknown 30583 1726853734.10247: calling self._execute() 30583 1726853734.10323: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853734.10327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853734.10336: variable 'omit' from source: magic vars 30583 1726853734.10620: variable 'ansible_distribution_major_version' from source: facts 30583 1726853734.10629: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853734.10744: variable 'network_provider' from source: set_fact 30583 1726853734.10748: variable 'network_state' from source: role '' defaults 30583 1726853734.10756: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30583 1726853734.10762: variable 'omit' from source: magic vars 30583 1726853734.10826: variable 'omit' from source: magic vars 30583 1726853734.10850: variable 'network_service_name' from source: role '' defaults 30583 1726853734.10900: variable 'network_service_name' from source: role '' defaults 30583 1726853734.10967: variable '__network_provider_setup' from source: role '' defaults 30583 1726853734.10972: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853734.11020: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853734.11027: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853734.11072: variable '__network_packages_default_nm' from source: role '' 
defaults 30583 1726853734.11228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853734.12854: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853734.13076: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853734.13079: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853734.13082: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853734.13084: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853734.13119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853734.13155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853734.13191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853734.13251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853734.13255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853734.13294: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853734.13310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853734.13327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853734.13356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853734.13375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853734.13532: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853734.13612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853734.13628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853734.13644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853734.13673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853734.13684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853734.13746: variable 'ansible_python' from source: facts 30583 1726853734.13758: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853734.13819: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853734.13875: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853734.13957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853734.13978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853734.13994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853734.14022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853734.14032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853734.14066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853734.14087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853734.14105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853734.14133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853734.14143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853734.14238: variable 'network_connections' from source: include params 30583 1726853734.14245: variable 'interface' from source: play vars 30583 1726853734.14300: variable 'interface' from source: play vars 30583 1726853734.14376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853734.14510: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853734.14546: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853734.14582: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853734.14612: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853734.14654: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853734.14682: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853734.14704: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853734.14726: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853734.14768: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853734.14945: variable 'network_connections' from source: include params 30583 1726853734.14949: variable 'interface' from source: play vars 30583 1726853734.15007: variable 'interface' from source: play vars 30583 1726853734.15030: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853734.15275: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853734.15389: variable 'network_connections' from source: include params 30583 1726853734.15399: variable 'interface' from source: play vars 30583 1726853734.15473: variable 'interface' from source: play vars 30583 1726853734.15500: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853734.15584: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853734.15874: variable 'network_connections' from source: include params 30583 1726853734.15884: variable 'interface' from source: play vars 30583 1726853734.15948: variable 'interface' from source: play vars 30583 1726853734.16004: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30583 1726853734.16064: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853734.16078: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853734.16137: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853734.16355: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853734.17050: variable 'network_connections' from source: include params 30583 1726853734.17064: variable 'interface' from source: play vars 30583 1726853734.17144: variable 'interface' from source: play vars 30583 1726853734.17156: variable 'ansible_distribution' from source: facts 30583 1726853734.17167: variable '__network_rh_distros' from source: role '' defaults 30583 1726853734.17180: variable 'ansible_distribution_major_version' from source: facts 30583 1726853734.17199: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853734.17391: variable 'ansible_distribution' from source: facts 30583 1726853734.17402: variable '__network_rh_distros' from source: role '' defaults 30583 1726853734.17412: variable 'ansible_distribution_major_version' from source: facts 30583 1726853734.17432: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853734.17660: variable 'ansible_distribution' from source: facts 30583 1726853734.17674: variable '__network_rh_distros' from source: role '' defaults 30583 1726853734.17685: variable 'ansible_distribution_major_version' from source: facts 30583 1726853734.17723: variable 'network_provider' from source: set_fact 30583 1726853734.17750: variable 'omit' from source: magic vars 30583 1726853734.17785: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853734.17883: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853734.17886: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853734.17889: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853734.17891: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853734.17914: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853734.17922: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853734.17929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853734.18029: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853734.18041: Set connection var ansible_timeout to 10 30583 1726853734.18047: Set connection var ansible_connection to ssh 30583 1726853734.18070: Set connection var ansible_shell_executable to /bin/sh 30583 1726853734.18082: Set connection var ansible_shell_type to sh 30583 1726853734.18096: Set connection var ansible_pipelining to False 30583 1726853734.18144: variable 'ansible_shell_executable' from source: unknown 30583 1726853734.18189: variable 'ansible_connection' from source: unknown 30583 1726853734.18192: variable 'ansible_module_compression' from source: unknown 30583 1726853734.18194: variable 'ansible_shell_type' from source: unknown 30583 1726853734.18196: variable 'ansible_shell_executable' from source: unknown 30583 1726853734.18198: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853734.18200: variable 'ansible_pipelining' from source: unknown 30583 1726853734.18202: variable 'ansible_timeout' from source: unknown 30583 1726853734.18203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 
1726853734.18305: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853734.18476: variable 'omit' from source: magic vars 30583 1726853734.18479: starting attempt loop 30583 1726853734.18481: running the handler 30583 1726853734.18483: variable 'ansible_facts' from source: unknown 30583 1726853734.19188: _low_level_execute_command(): starting 30583 1726853734.19201: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853734.20032: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853734.20098: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853734.21838: stdout chunk (state=3): >>>/root 
<<< 30583 1726853734.21966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853734.21999: stderr chunk (state=3): >>><<< 30583 1726853734.22015: stdout chunk (state=3): >>><<< 30583 1726853734.22036: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853734.22052: _low_level_execute_command(): starting 30583 1726853734.22069: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853734.2204237-33854-181909951377522 `" && echo ansible-tmp-1726853734.2204237-33854-181909951377522="` echo /root/.ansible/tmp/ansible-tmp-1726853734.2204237-33854-181909951377522 `" ) && sleep 0' 30583 1726853734.22702: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
<<< 30583 1726853734.22789: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853734.22831: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853734.22846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853734.22867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853734.22970: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853734.24968: stdout chunk (state=3): >>>ansible-tmp-1726853734.2204237-33854-181909951377522=/root/.ansible/tmp/ansible-tmp-1726853734.2204237-33854-181909951377522 <<< 30583 1726853734.25122: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853734.25125: stdout chunk (state=3): >>><<< 30583 1726853734.25128: stderr chunk (state=3): >>><<< 30583 1726853734.25276: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853734.2204237-33854-181909951377522=/root/.ansible/tmp/ansible-tmp-1726853734.2204237-33854-181909951377522 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853734.25280: variable 'ansible_module_compression' from source: unknown 30583 1726853734.25282: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30583 1726853734.25309: variable 'ansible_facts' from source: unknown 30583 1726853734.25566: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853734.2204237-33854-181909951377522/AnsiballZ_systemd.py 30583 1726853734.25692: Sending initial data 30583 1726853734.25695: Sent initial data (156 bytes) 30583 1726853734.26151: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853734.26157: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853734.26162: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853734.26164: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853734.26172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853734.26217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853734.26220: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853734.26224: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853734.26296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853734.28110: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" 
revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853734.28192: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853734.28276: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpcdzjhkv1 /root/.ansible/tmp/ansible-tmp-1726853734.2204237-33854-181909951377522/AnsiballZ_systemd.py <<< 30583 1726853734.28299: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853734.2204237-33854-181909951377522/AnsiballZ_systemd.py" <<< 30583 1726853734.28376: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpcdzjhkv1" to remote "/root/.ansible/tmp/ansible-tmp-1726853734.2204237-33854-181909951377522/AnsiballZ_systemd.py" <<< 30583 1726853734.28414: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853734.2204237-33854-181909951377522/AnsiballZ_systemd.py" <<< 30583 1726853734.29636: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853734.29755: stderr chunk (state=3): >>><<< 30583 1726853734.29762: stdout chunk (state=3): >>><<< 30583 1726853734.29765: done transferring module to remote 30583 1726853734.29767: _low_level_execute_command(): starting 30583 1726853734.29769: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853734.2204237-33854-181909951377522/ /root/.ansible/tmp/ansible-tmp-1726853734.2204237-33854-181909951377522/AnsiballZ_systemd.py && sleep 0' 30583 1726853734.30143: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853734.30176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853734.30283: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853734.30296: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853734.30533: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853734.32304: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853734.32342: stderr chunk (state=3): >>><<< 30583 1726853734.32351: stdout chunk (state=3): >>><<< 30583 1726853734.32369: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853734.32380: _low_level_execute_command(): starting 30583 1726853734.32389: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853734.2204237-33854-181909951377522/AnsiballZ_systemd.py && sleep 0' 30583 1726853734.32961: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853734.32979: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853734.32993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853734.33010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853734.33025: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853734.33036: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853734.33048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853734.33065: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853734.33084: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853734.33167: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853734.33189: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853734.33214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853734.33310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853734.63296: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", 
"ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4653056", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3297427456", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1885381000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", 
"BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": 
"n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "syst<<< 30583 1726853734.63328: stdout chunk (state=3): >>>em.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 
EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30583 1726853734.65321: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853734.65680: stderr chunk (state=3): >>>Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853734.65684: stdout chunk (state=3): >>><<< 30583 1726853734.65686: stderr chunk (state=3): >>><<< 30583 1726853734.65690: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4653056", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3297427456", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1885381000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", 
"ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853734.65843: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853734.2204237-33854-181909951377522/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853734.66211: _low_level_execute_command(): starting 30583 1726853734.66215: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853734.2204237-33854-181909951377522/ > /dev/null 2>&1 && sleep 0' 30583 1726853734.67283: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853734.67302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853734.67385: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853734.67629: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853734.67689: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853734.69750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853734.69753: stdout chunk (state=3): >>><<< 30583 1726853734.69756: stderr chunk (state=3): >>><<< 30583 1726853734.69977: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853734.69980: handler run complete 30583 1726853734.69982: 
attempt loop complete, returning result 30583 1726853734.70076: _execute() done 30583 1726853734.70080: dumping result to json 30583 1726853734.70082: done dumping result, returning 30583 1726853734.70084: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-05ea-abc5-00000000146b] 30583 1726853734.70086: sending task result for task 02083763-bbaf-05ea-abc5-00000000146b ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853734.70518: no more pending results, returning what we have 30583 1726853734.70521: results queue empty 30583 1726853734.70522: checking for any_errors_fatal 30583 1726853734.70526: done checking for any_errors_fatal 30583 1726853734.70527: checking for max_fail_percentage 30583 1726853734.70529: done checking for max_fail_percentage 30583 1726853734.70529: checking to see if all hosts have failed and the running result is not ok 30583 1726853734.70530: done checking to see if all hosts have failed 30583 1726853734.70531: getting the remaining hosts for this loop 30583 1726853734.70533: done getting the remaining hosts for this loop 30583 1726853734.70536: getting the next task for host managed_node2 30583 1726853734.70543: done getting next task for host managed_node2 30583 1726853734.70546: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30583 1726853734.70551: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853734.70563: getting variables 30583 1726853734.70565: in VariableManager get_vars() 30583 1726853734.70602: Calling all_inventory to load vars for managed_node2 30583 1726853734.70605: Calling groups_inventory to load vars for managed_node2 30583 1726853734.70608: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853734.70620: Calling all_plugins_play to load vars for managed_node2 30583 1726853734.70623: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853734.70627: Calling groups_plugins_play to load vars for managed_node2 30583 1726853734.71788: done sending task result for task 02083763-bbaf-05ea-abc5-00000000146b 30583 1726853734.71793: WORKER PROCESS EXITING 30583 1726853734.73519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853734.76767: done with get_vars() 30583 1726853734.76953: done getting variables 30583 1726853734.77017: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:35:34 -0400 (0:00:00.674) 0:01:10.107 ****** 30583 1726853734.77066: entering _queue_task() for managed_node2/service 30583 1726853734.77545: worker is 1 (out of 1 available) 30583 1726853734.77562: exiting _queue_task() for managed_node2/service 30583 1726853734.77781: done queuing things up, now waiting for results queue to drain 30583 1726853734.77783: waiting for pending results... 30583 1726853734.77920: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30583 1726853734.78105: in run() - task 02083763-bbaf-05ea-abc5-00000000146c 30583 1726853734.78131: variable 'ansible_search_path' from source: unknown 30583 1726853734.78139: variable 'ansible_search_path' from source: unknown 30583 1726853734.78191: calling self._execute() 30583 1726853734.78310: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853734.78332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853734.78347: variable 'omit' from source: magic vars 30583 1726853734.78787: variable 'ansible_distribution_major_version' from source: facts 30583 1726853734.78805: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853734.78934: variable 'network_provider' from source: set_fact 30583 1726853734.78947: Evaluated conditional (network_provider == "nm"): True 30583 1726853734.79057: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853734.79156: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 30583 1726853734.79414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853734.82579: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853734.82636: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853734.82776: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853734.82815: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853734.82890: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853734.83113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853734.83149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853734.83385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853734.83388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853734.83391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853734.83519: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853734.83549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853734.83586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853734.83712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853734.83731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853734.83778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853734.83846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853734.84038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853734.84041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 
1726853734.84044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853734.84400: variable 'network_connections' from source: include params 30583 1726853734.84419: variable 'interface' from source: play vars 30583 1726853734.84605: variable 'interface' from source: play vars 30583 1726853734.84876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853734.85076: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853734.85378: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853734.85381: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853734.85390: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853734.85439: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853734.85510: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853734.85538: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853734.85610: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853734.85755: variable 
'__network_wireless_connections_defined' from source: role '' defaults 30583 1726853734.86273: variable 'network_connections' from source: include params 30583 1726853734.86367: variable 'interface' from source: play vars 30583 1726853734.86433: variable 'interface' from source: play vars 30583 1726853734.86887: Evaluated conditional (__network_wpa_supplicant_required): False 30583 1726853734.86890: when evaluation is False, skipping this task 30583 1726853734.86893: _execute() done 30583 1726853734.86895: dumping result to json 30583 1726853734.86897: done dumping result, returning 30583 1726853734.86900: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-05ea-abc5-00000000146c] 30583 1726853734.86911: sending task result for task 02083763-bbaf-05ea-abc5-00000000146c skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30583 1726853734.87049: no more pending results, returning what we have 30583 1726853734.87053: results queue empty 30583 1726853734.87054: checking for any_errors_fatal 30583 1726853734.87087: done checking for any_errors_fatal 30583 1726853734.87088: checking for max_fail_percentage 30583 1726853734.87090: done checking for max_fail_percentage 30583 1726853734.87091: checking to see if all hosts have failed and the running result is not ok 30583 1726853734.87092: done checking to see if all hosts have failed 30583 1726853734.87093: getting the remaining hosts for this loop 30583 1726853734.87095: done getting the remaining hosts for this loop 30583 1726853734.87099: getting the next task for host managed_node2 30583 1726853734.87109: done getting next task for host managed_node2 30583 1726853734.87114: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30583 1726853734.87119: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853734.87144: getting variables 30583 1726853734.87146: in VariableManager get_vars() 30583 1726853734.87498: Calling all_inventory to load vars for managed_node2 30583 1726853734.87502: Calling groups_inventory to load vars for managed_node2 30583 1726853734.87504: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853734.87517: Calling all_plugins_play to load vars for managed_node2 30583 1726853734.87520: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853734.87524: Calling groups_plugins_play to load vars for managed_node2 30583 1726853734.88384: done sending task result for task 02083763-bbaf-05ea-abc5-00000000146c 30583 1726853734.88387: WORKER PROCESS EXITING 30583 1726853734.90866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853734.94515: done with get_vars() 30583 1726853734.94548: done getting variables 30583 1726853734.94614: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:35:34 -0400 (0:00:00.175) 0:01:10.283 ****** 30583 1726853734.94652: entering _queue_task() for managed_node2/service 30583 1726853734.95421: worker is 1 (out of 1 available) 30583 1726853734.95436: exiting _queue_task() for managed_node2/service 30583 1726853734.95449: done queuing things up, now waiting for results queue to drain 30583 1726853734.95450: waiting for pending results... 
30583 1726853734.95852: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30583 1726853734.96220: in run() - task 02083763-bbaf-05ea-abc5-00000000146d 30583 1726853734.96243: variable 'ansible_search_path' from source: unknown 30583 1726853734.96577: variable 'ansible_search_path' from source: unknown 30583 1726853734.96581: calling self._execute() 30583 1726853734.96776: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853734.96780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853734.96784: variable 'omit' from source: magic vars 30583 1726853734.97398: variable 'ansible_distribution_major_version' from source: facts 30583 1726853734.97414: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853734.97533: variable 'network_provider' from source: set_fact 30583 1726853734.97784: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853734.97976: when evaluation is False, skipping this task 30583 1726853734.97979: _execute() done 30583 1726853734.97982: dumping result to json 30583 1726853734.97984: done dumping result, returning 30583 1726853734.97987: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-05ea-abc5-00000000146d] 30583 1726853734.97990: sending task result for task 02083763-bbaf-05ea-abc5-00000000146d 30583 1726853734.98063: done sending task result for task 02083763-bbaf-05ea-abc5-00000000146d 30583 1726853734.98067: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853734.98116: no more pending results, returning what we have 30583 1726853734.98120: results queue empty 30583 1726853734.98121: checking for any_errors_fatal 30583 1726853734.98128: done checking for 
any_errors_fatal 30583 1726853734.98128: checking for max_fail_percentage 30583 1726853734.98130: done checking for max_fail_percentage 30583 1726853734.98131: checking to see if all hosts have failed and the running result is not ok 30583 1726853734.98132: done checking to see if all hosts have failed 30583 1726853734.98132: getting the remaining hosts for this loop 30583 1726853734.98134: done getting the remaining hosts for this loop 30583 1726853734.98138: getting the next task for host managed_node2 30583 1726853734.98145: done getting next task for host managed_node2 30583 1726853734.98149: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853734.98154: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853734.98176: getting variables 30583 1726853734.98178: in VariableManager get_vars() 30583 1726853734.98212: Calling all_inventory to load vars for managed_node2 30583 1726853734.98214: Calling groups_inventory to load vars for managed_node2 30583 1726853734.98216: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853734.98225: Calling all_plugins_play to load vars for managed_node2 30583 1726853734.98227: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853734.98230: Calling groups_plugins_play to load vars for managed_node2 30583 1726853735.01378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853735.04748: done with get_vars() 30583 1726853735.04782: done getting variables 30583 1726853735.04899: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:35:35 -0400 (0:00:00.102) 0:01:10.386 ****** 30583 1726853735.04948: entering _queue_task() for managed_node2/copy 30583 1726853735.05321: worker is 1 (out of 1 available) 30583 1726853735.05334: exiting _queue_task() for managed_node2/copy 30583 1726853735.05346: done queuing things up, now waiting for results queue to drain 30583 1726853735.05347: waiting for pending results... 
30583 1726853735.05666: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853735.05965: in run() - task 02083763-bbaf-05ea-abc5-00000000146e 30583 1726853735.05986: variable 'ansible_search_path' from source: unknown 30583 1726853735.05994: variable 'ansible_search_path' from source: unknown 30583 1726853735.06048: calling self._execute() 30583 1726853735.06152: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853735.06166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853735.06182: variable 'omit' from source: magic vars 30583 1726853735.06586: variable 'ansible_distribution_major_version' from source: facts 30583 1726853735.06602: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853735.06727: variable 'network_provider' from source: set_fact 30583 1726853735.06738: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853735.06745: when evaluation is False, skipping this task 30583 1726853735.06752: _execute() done 30583 1726853735.06761: dumping result to json 30583 1726853735.06768: done dumping result, returning 30583 1726853735.06783: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-05ea-abc5-00000000146e] 30583 1726853735.06800: sending task result for task 02083763-bbaf-05ea-abc5-00000000146e skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30583 1726853735.06965: no more pending results, returning what we have 30583 1726853735.06970: results queue empty 30583 1726853735.06973: checking for any_errors_fatal 30583 1726853735.06982: done checking for any_errors_fatal 30583 1726853735.06983: checking for max_fail_percentage 30583 
1726853735.06985: done checking for max_fail_percentage 30583 1726853735.06986: checking to see if all hosts have failed and the running result is not ok 30583 1726853735.06987: done checking to see if all hosts have failed 30583 1726853735.06988: getting the remaining hosts for this loop 30583 1726853735.06990: done getting the remaining hosts for this loop 30583 1726853735.06994: getting the next task for host managed_node2 30583 1726853735.07003: done getting next task for host managed_node2 30583 1726853735.07119: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853735.07125: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853735.07154: getting variables 30583 1726853735.07156: in VariableManager get_vars() 30583 1726853735.07206: Calling all_inventory to load vars for managed_node2 30583 1726853735.07209: Calling groups_inventory to load vars for managed_node2 30583 1726853735.07211: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853735.07223: Calling all_plugins_play to load vars for managed_node2 30583 1726853735.07226: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853735.07228: Calling groups_plugins_play to load vars for managed_node2 30583 1726853735.07948: done sending task result for task 02083763-bbaf-05ea-abc5-00000000146e 30583 1726853735.07952: WORKER PROCESS EXITING 30583 1726853735.09172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853735.11115: done with get_vars() 30583 1726853735.11136: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:35:35 -0400 (0:00:00.062) 0:01:10.449 ****** 30583 1726853735.11241: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853735.11850: worker is 1 (out of 1 available) 30583 1726853735.11865: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853735.11892: done queuing things up, now waiting for results queue to drain 30583 1726853735.11894: waiting for pending results... 
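The skip recorded above ("Evaluated conditional (network_provider == \"initscripts\"): False ... skipping this task") follows Ansible's per-task `when` handling: each conditional is templated against the current variables, and the first one that evaluates false short-circuits the task into a skipped result that records the false condition. A minimal sketch of that control flow, assuming nothing beyond what the log shows (the function name and the callable-based conditions are illustrative, not Ansible's actual API, which templates Jinja2 expressions):

```python
def run_task(task, variables):
    """Illustrative sketch: evaluate each `when` clause in order; the
    first False one produces a skipped result naming the false condition,
    mirroring the JSON skip result emitted in the log."""
    for label, cond in task["when"]:
        if not cond(variables):
            return {
                "changed": False,
                "false_condition": label,
                "skip_reason": "Conditional result was False",
            }
    return {"changed": True}

# The two conditions visible in the log for this task:
task = {
    "when": [
        ("ansible_distribution_major_version != '6'",
         lambda v: v["ansible_distribution_major_version"] != "6"),
        ('network_provider == "initscripts"',
         lambda v: v["network_provider"] == "initscripts"),
    ],
}

# network_provider was set to "nm" by set_fact, so the second clause fails:
result = run_task(task, {"ansible_distribution_major_version": "10",
                         "network_provider": "nm"})
```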
30583 1726853735.12492: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853735.12876: in run() - task 02083763-bbaf-05ea-abc5-00000000146f 30583 1726853735.12881: variable 'ansible_search_path' from source: unknown 30583 1726853735.12884: variable 'ansible_search_path' from source: unknown 30583 1726853735.12886: calling self._execute() 30583 1726853735.12889: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853735.12892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853735.12894: variable 'omit' from source: magic vars 30583 1726853735.13444: variable 'ansible_distribution_major_version' from source: facts 30583 1726853735.13467: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853735.13482: variable 'omit' from source: magic vars 30583 1726853735.13549: variable 'omit' from source: magic vars 30583 1726853735.13705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853735.15682: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853735.15758: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853735.15802: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853735.15838: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853735.16104: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853735.16230: variable 'network_provider' from source: set_fact 30583 1726853735.16402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853735.16445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853735.16484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853735.16526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853735.16545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853735.16625: variable 'omit' from source: magic vars 30583 1726853735.16751: variable 'omit' from source: magic vars 30583 1726853735.16851: variable 'network_connections' from source: include params 30583 1726853735.17177: variable 'interface' from source: play vars 30583 1726853735.17180: variable 'interface' from source: play vars 30583 1726853735.17315: variable 'omit' from source: magic vars 30583 1726853735.17676: variable '__lsr_ansible_managed' from source: task vars 30583 1726853735.17679: variable '__lsr_ansible_managed' from source: task vars 30583 1726853735.17947: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30583 1726853735.18521: Loaded config def from plugin (lookup/template) 30583 1726853735.18531: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30583 1726853735.18598: File lookup term: get_ansible_managed.j2 30583 1726853735.18604: variable 
'ansible_search_path' from source: unknown 30583 1726853735.18612: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30583 1726853735.18626: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30583 1726853735.18646: variable 'ansible_search_path' from source: unknown 30583 1726853735.25556: variable 'ansible_managed' from source: unknown 30583 1726853735.25694: variable 'omit' from source: magic vars 30583 1726853735.25729: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853735.25756: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853735.25778: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853735.25794: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30583 1726853735.25804: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853735.25837: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853735.25840: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853735.25843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853735.25936: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853735.25950: Set connection var ansible_timeout to 10 30583 1726853735.25953: Set connection var ansible_connection to ssh 30583 1726853735.25958: Set connection var ansible_shell_executable to /bin/sh 30583 1726853735.25963: Set connection var ansible_shell_type to sh 30583 1726853735.25975: Set connection var ansible_pipelining to False 30583 1726853735.25999: variable 'ansible_shell_executable' from source: unknown 30583 1726853735.26002: variable 'ansible_connection' from source: unknown 30583 1726853735.26004: variable 'ansible_module_compression' from source: unknown 30583 1726853735.26007: variable 'ansible_shell_type' from source: unknown 30583 1726853735.26009: variable 'ansible_shell_executable' from source: unknown 30583 1726853735.26012: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853735.26016: variable 'ansible_pipelining' from source: unknown 30583 1726853735.26018: variable 'ansible_timeout' from source: unknown 30583 1726853735.26022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853735.26156: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853735.26179: variable 'omit' from 
source: magic vars 30583 1726853735.26185: starting attempt loop 30583 1726853735.26188: running the handler 30583 1726853735.26202: _low_level_execute_command(): starting 30583 1726853735.26208: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853735.27041: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853735.27060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853735.27123: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853735.27126: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853735.27202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853735.28934: stdout chunk (state=3): >>>/root <<< 30583 1726853735.29168: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853735.29173: stdout chunk (state=3): >>><<< 30583 1726853735.29177: stderr chunk (state=3): >>><<< 30583 1726853735.29199: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853735.29303: _low_level_execute_command(): starting 30583 1726853735.29307: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853735.292065-33901-49321214909614 `" && echo ansible-tmp-1726853735.292065-33901-49321214909614="` echo /root/.ansible/tmp/ansible-tmp-1726853735.292065-33901-49321214909614 `" ) && sleep 0' 30583 1726853735.29845: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853735.29862: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853735.29884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853735.29907: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853735.29994: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853735.30030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853735.30049: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853735.30073: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853735.30174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853735.32184: stdout chunk (state=3): >>>ansible-tmp-1726853735.292065-33901-49321214909614=/root/.ansible/tmp/ansible-tmp-1726853735.292065-33901-49321214909614 <<< 30583 1726853735.32353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853735.32356: stdout chunk (state=3): >>><<< 30583 1726853735.32362: stderr chunk (state=3): >>><<< 30583 1726853735.32576: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853735.292065-33901-49321214909614=/root/.ansible/tmp/ansible-tmp-1726853735.292065-33901-49321214909614 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853735.32580: variable 'ansible_module_compression' from source: unknown 30583 1726853735.32582: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30583 1726853735.32585: variable 'ansible_facts' from source: unknown 30583 1726853735.32694: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853735.292065-33901-49321214909614/AnsiballZ_network_connections.py 30583 1726853735.32941: Sending initial data 30583 1726853735.32944: Sent initial data (166 bytes) 30583 1726853735.33515: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853735.33529: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853735.33542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853735.33562: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853735.33683: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853735.33706: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853735.33814: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853735.35477: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853735.35565: stderr chunk (state=3): 
>>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853735.35654: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp3xli1sqv /root/.ansible/tmp/ansible-tmp-1726853735.292065-33901-49321214909614/AnsiballZ_network_connections.py <<< 30583 1726853735.35658: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853735.292065-33901-49321214909614/AnsiballZ_network_connections.py" <<< 30583 1726853735.35735: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp3xli1sqv" to remote "/root/.ansible/tmp/ansible-tmp-1726853735.292065-33901-49321214909614/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853735.292065-33901-49321214909614/AnsiballZ_network_connections.py" <<< 30583 1726853735.36890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853735.36944: stderr chunk (state=3): >>><<< 30583 1726853735.37075: stdout chunk (state=3): >>><<< 30583 1726853735.37078: done transferring module to remote 30583 1726853735.37081: _low_level_execute_command(): starting 30583 1726853735.37083: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853735.292065-33901-49321214909614/ /root/.ansible/tmp/ansible-tmp-1726853735.292065-33901-49321214909614/AnsiballZ_network_connections.py && sleep 0' 30583 1726853735.37685: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853735.37730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853735.37743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853735.37758: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853735.37859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853735.39777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853735.39781: stdout chunk (state=3): >>><<< 30583 1726853735.39811: stderr chunk (state=3): >>><<< 30583 1726853735.39814: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 
10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853735.39817: _low_level_execute_command(): starting 30583 1726853735.39819: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853735.292065-33901-49321214909614/AnsiballZ_network_connections.py && sleep 0' 30583 1726853735.40629: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853735.40632: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853735.40635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853735.40637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853735.40639: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853735.40641: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853735.40643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853735.40645: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853735.40647: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853735.40649: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853735.40650: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853735.40652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853735.40655: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853735.40656: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853735.40663: stderr chunk (state=3): >>>debug2: match found <<< 30583 1726853735.40665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853735.40702: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853735.40711: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853735.40728: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853735.40886: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853735.68760: stdout chunk (state=3): >>>Traceback (most recent call last):<<< 30583 1726853735.68768: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_y1qc6dsz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_y1qc6dsz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/11d9efea-f4e2-4de6-9b17-bfa7490d4840: error=unknown <<< 30583 1726853735.68890: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], 
"__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30583 1726853735.70881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853735.70905: stderr chunk (state=3): >>><<< 30583 1726853735.70908: stdout chunk (state=3): >>><<< 30583 1726853735.70931: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_y1qc6dsz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_y1qc6dsz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/11d9efea-f4e2-4de6-9b17-bfa7490d4840: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# 
system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853735.70979: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853735.292065-33901-49321214909614/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853735.70988: _low_level_execute_command(): starting 30583 1726853735.70992: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853735.292065-33901-49321214909614/ > /dev/null 2>&1 && sleep 0' 30583 1726853735.71432: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853735.71437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853735.71472: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853735.71476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853735.71479: stderr chunk (state=3): >>>debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853735.71481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853735.71529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853735.71532: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853735.71535: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853735.71613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853735.73530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853735.73556: stderr chunk (state=3): >>><<< 30583 1726853735.73569: stdout chunk (state=3): >>><<< 30583 1726853735.73583: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853735.73588: handler run complete 30583 1726853735.73607: attempt loop complete, returning result 30583 1726853735.73610: _execute() done 30583 1726853735.73612: dumping result to json 30583 1726853735.73617: done dumping result, returning 30583 1726853735.73625: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-05ea-abc5-00000000146f] 30583 1726853735.73630: sending task result for task 02083763-bbaf-05ea-abc5-00000000146f 30583 1726853735.73725: done sending task result for task 02083763-bbaf-05ea-abc5-00000000146f 30583 1726853735.73728: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 30583 1726853735.73824: no more pending results, returning what we have 30583 1726853735.73828: results queue empty 30583 1726853735.73829: checking for any_errors_fatal 30583 1726853735.73837: done checking for any_errors_fatal 30583 1726853735.73838: checking for max_fail_percentage 30583 1726853735.73840: done checking for max_fail_percentage 30583 1726853735.73841: checking to see if all hosts have failed and the running result is not ok 30583 1726853735.73842: done checking to see if all hosts have failed 30583 1726853735.73843: getting the remaining hosts for this loop 30583 1726853735.73844: done getting the remaining hosts for this loop 30583 
1726853735.73847: getting the next task for host managed_node2 30583 1726853735.73854: done getting next task for host managed_node2 30583 1726853735.73860: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853735.73865: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853735.73877: getting variables 30583 1726853735.73879: in VariableManager get_vars() 30583 1726853735.73917: Calling all_inventory to load vars for managed_node2 30583 1726853735.73920: Calling groups_inventory to load vars for managed_node2 30583 1726853735.73922: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853735.73931: Calling all_plugins_play to load vars for managed_node2 30583 1726853735.73934: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853735.73936: Calling groups_plugins_play to load vars for managed_node2 30583 1726853735.74766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853735.75642: done with get_vars() 30583 1726853735.75661: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:35:35 -0400 (0:00:00.644) 0:01:11.094 ****** 30583 1726853735.75727: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30583 1726853735.75985: worker is 1 (out of 1 available) 30583 1726853735.76000: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30583 1726853735.76012: done queuing things up, now waiting for results queue to drain 30583 1726853735.76014: waiting for pending results... 
30583 1726853735.76213: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853735.76298: in run() - task 02083763-bbaf-05ea-abc5-000000001470 30583 1726853735.76310: variable 'ansible_search_path' from source: unknown 30583 1726853735.76314: variable 'ansible_search_path' from source: unknown 30583 1726853735.76345: calling self._execute() 30583 1726853735.76431: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853735.76436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853735.76444: variable 'omit' from source: magic vars 30583 1726853735.76735: variable 'ansible_distribution_major_version' from source: facts 30583 1726853735.76745: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853735.76833: variable 'network_state' from source: role '' defaults 30583 1726853735.76843: Evaluated conditional (network_state != {}): False 30583 1726853735.76846: when evaluation is False, skipping this task 30583 1726853735.76848: _execute() done 30583 1726853735.76850: dumping result to json 30583 1726853735.76853: done dumping result, returning 30583 1726853735.76861: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-05ea-abc5-000000001470] 30583 1726853735.76867: sending task result for task 02083763-bbaf-05ea-abc5-000000001470 30583 1726853735.76955: done sending task result for task 02083763-bbaf-05ea-abc5-000000001470 30583 1726853735.76958: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853735.77046: no more pending results, returning what we have 30583 1726853735.77050: results queue empty 30583 1726853735.77051: checking for any_errors_fatal 30583 1726853735.77057: done checking for any_errors_fatal 
30583 1726853735.77058: checking for max_fail_percentage 30583 1726853735.77060: done checking for max_fail_percentage 30583 1726853735.77061: checking to see if all hosts have failed and the running result is not ok 30583 1726853735.77062: done checking to see if all hosts have failed 30583 1726853735.77062: getting the remaining hosts for this loop 30583 1726853735.77064: done getting the remaining hosts for this loop 30583 1726853735.77069: getting the next task for host managed_node2 30583 1726853735.77077: done getting next task for host managed_node2 30583 1726853735.77080: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30583 1726853735.77085: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853735.77103: getting variables 30583 1726853735.77104: in VariableManager get_vars() 30583 1726853735.77135: Calling all_inventory to load vars for managed_node2 30583 1726853735.77137: Calling groups_inventory to load vars for managed_node2 30583 1726853735.77139: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853735.77148: Calling all_plugins_play to load vars for managed_node2 30583 1726853735.77150: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853735.77152: Calling groups_plugins_play to load vars for managed_node2 30583 1726853735.78068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853735.78926: done with get_vars() 30583 1726853735.78942: done getting variables 30583 1726853735.78987: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:35:35 -0400 (0:00:00.032) 0:01:11.127 ****** 30583 1726853735.79012: entering _queue_task() for managed_node2/debug 30583 1726853735.79262: worker is 1 (out of 1 available) 30583 1726853735.79278: exiting _queue_task() for managed_node2/debug 30583 1726853735.79291: done queuing things up, now waiting for results queue to drain 30583 1726853735.79292: waiting for pending results... 
30583 1726853735.79487: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30583 1726853735.79593: in run() - task 02083763-bbaf-05ea-abc5-000000001471 30583 1726853735.79605: variable 'ansible_search_path' from source: unknown 30583 1726853735.79608: variable 'ansible_search_path' from source: unknown 30583 1726853735.79640: calling self._execute() 30583 1726853735.79717: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853735.79721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853735.79732: variable 'omit' from source: magic vars 30583 1726853735.80014: variable 'ansible_distribution_major_version' from source: facts 30583 1726853735.80024: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853735.80029: variable 'omit' from source: magic vars 30583 1726853735.80082: variable 'omit' from source: magic vars 30583 1726853735.80106: variable 'omit' from source: magic vars 30583 1726853735.80138: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853735.80168: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853735.80188: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853735.80201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853735.80210: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853735.80233: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853735.80237: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853735.80239: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 30583 1726853735.80317: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853735.80323: Set connection var ansible_timeout to 10 30583 1726853735.80326: Set connection var ansible_connection to ssh 30583 1726853735.80330: Set connection var ansible_shell_executable to /bin/sh 30583 1726853735.80333: Set connection var ansible_shell_type to sh 30583 1726853735.80340: Set connection var ansible_pipelining to False 30583 1726853735.80358: variable 'ansible_shell_executable' from source: unknown 30583 1726853735.80364: variable 'ansible_connection' from source: unknown 30583 1726853735.80367: variable 'ansible_module_compression' from source: unknown 30583 1726853735.80369: variable 'ansible_shell_type' from source: unknown 30583 1726853735.80373: variable 'ansible_shell_executable' from source: unknown 30583 1726853735.80375: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853735.80380: variable 'ansible_pipelining' from source: unknown 30583 1726853735.80382: variable 'ansible_timeout' from source: unknown 30583 1726853735.80386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853735.80490: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853735.80505: variable 'omit' from source: magic vars 30583 1726853735.80508: starting attempt loop 30583 1726853735.80511: running the handler 30583 1726853735.80605: variable '__network_connections_result' from source: set_fact 30583 1726853735.80648: handler run complete 30583 1726853735.80662: attempt loop complete, returning result 30583 1726853735.80665: _execute() done 30583 1726853735.80668: dumping result to json 30583 1726853735.80670: 
done dumping result, returning 30583 1726853735.80681: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-05ea-abc5-000000001471] 30583 1726853735.80684: sending task result for task 02083763-bbaf-05ea-abc5-000000001471 30583 1726853735.80769: done sending task result for task 02083763-bbaf-05ea-abc5-000000001471 30583 1726853735.80774: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 30583 1726853735.80843: no more pending results, returning what we have 30583 1726853735.80847: results queue empty 30583 1726853735.80848: checking for any_errors_fatal 30583 1726853735.80855: done checking for any_errors_fatal 30583 1726853735.80855: checking for max_fail_percentage 30583 1726853735.80857: done checking for max_fail_percentage 30583 1726853735.80858: checking to see if all hosts have failed and the running result is not ok 30583 1726853735.80859: done checking to see if all hosts have failed 30583 1726853735.80859: getting the remaining hosts for this loop 30583 1726853735.80861: done getting the remaining hosts for this loop 30583 1726853735.80865: getting the next task for host managed_node2 30583 1726853735.80874: done getting next task for host managed_node2 30583 1726853735.80877: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30583 1726853735.80887: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853735.80898: getting variables 30583 1726853735.80900: in VariableManager get_vars() 30583 1726853735.80934: Calling all_inventory to load vars for managed_node2 30583 1726853735.80937: Calling groups_inventory to load vars for managed_node2 30583 1726853735.80939: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853735.80947: Calling all_plugins_play to load vars for managed_node2 30583 1726853735.80950: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853735.80952: Calling groups_plugins_play to load vars for managed_node2 30583 1726853735.81743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853735.82615: done with get_vars() 30583 1726853735.82631: done getting variables 30583 1726853735.82677: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:35:35 -0400 (0:00:00.036) 0:01:11.164 ****** 30583 1726853735.82707: entering _queue_task() for managed_node2/debug 30583 1726853735.82944: worker is 1 (out of 1 available) 30583 1726853735.82958: exiting _queue_task() for managed_node2/debug 30583 1726853735.82972: done queuing things up, now waiting for results queue to drain 30583 1726853735.82974: waiting for pending results... 30583 1726853735.83163: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30583 1726853735.83260: in run() - task 02083763-bbaf-05ea-abc5-000000001472 30583 1726853735.83274: variable 'ansible_search_path' from source: unknown 30583 1726853735.83277: variable 'ansible_search_path' from source: unknown 30583 1726853735.83311: calling self._execute() 30583 1726853735.83385: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853735.83389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853735.83398: variable 'omit' from source: magic vars 30583 1726853735.83687: variable 'ansible_distribution_major_version' from source: facts 30583 1726853735.83697: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853735.83702: variable 'omit' from source: magic vars 30583 1726853735.83754: variable 'omit' from source: magic vars 30583 1726853735.83775: variable 'omit' from source: magic vars 30583 1726853735.83808: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853735.83836: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853735.83851: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853735.83874: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853735.83886: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853735.83910: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853735.83913: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853735.83915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853735.83991: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853735.83994: Set connection var ansible_timeout to 10 30583 1726853735.83998: Set connection var ansible_connection to ssh 30583 1726853735.84003: Set connection var ansible_shell_executable to /bin/sh 30583 1726853735.84006: Set connection var ansible_shell_type to sh 30583 1726853735.84013: Set connection var ansible_pipelining to False 30583 1726853735.84030: variable 'ansible_shell_executable' from source: unknown 30583 1726853735.84033: variable 'ansible_connection' from source: unknown 30583 1726853735.84036: variable 'ansible_module_compression' from source: unknown 30583 1726853735.84039: variable 'ansible_shell_type' from source: unknown 30583 1726853735.84041: variable 'ansible_shell_executable' from source: unknown 30583 1726853735.84043: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853735.84045: variable 'ansible_pipelining' from source: unknown 30583 1726853735.84048: variable 'ansible_timeout' from source: unknown 30583 1726853735.84052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853735.84153: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853735.84163: variable 'omit' from source: magic vars 30583 1726853735.84166: starting attempt loop 30583 1726853735.84168: running the handler 30583 1726853735.84213: variable '__network_connections_result' from source: set_fact 30583 1726853735.84267: variable '__network_connections_result' from source: set_fact 30583 1726853735.84341: handler run complete 30583 1726853735.84357: attempt loop complete, returning result 30583 1726853735.84363: _execute() done 30583 1726853735.84365: dumping result to json 30583 1726853735.84367: done dumping result, returning 30583 1726853735.84375: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-05ea-abc5-000000001472] 30583 1726853735.84379: sending task result for task 02083763-bbaf-05ea-abc5-000000001472 30583 1726853735.84465: done sending task result for task 02083763-bbaf-05ea-abc5-000000001472 30583 1726853735.84467: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 30583 1726853735.84579: no more pending results, returning what we have 30583 1726853735.84582: results queue empty 30583 1726853735.84583: checking for any_errors_fatal 30583 1726853735.84587: done checking for any_errors_fatal 30583 1726853735.84587: checking for max_fail_percentage 30583 1726853735.84589: done checking for max_fail_percentage 30583 1726853735.84590: checking to see if all hosts have 
failed and the running result is not ok 30583 1726853735.84590: done checking to see if all hosts have failed 30583 1726853735.84591: getting the remaining hosts for this loop 30583 1726853735.84592: done getting the remaining hosts for this loop 30583 1726853735.84595: getting the next task for host managed_node2 30583 1726853735.84602: done getting next task for host managed_node2 30583 1726853735.84605: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30583 1726853735.84609: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853735.84619: getting variables 30583 1726853735.84621: in VariableManager get_vars() 30583 1726853735.84653: Calling all_inventory to load vars for managed_node2 30583 1726853735.84656: Calling groups_inventory to load vars for managed_node2 30583 1726853735.84660: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853735.84667: Calling all_plugins_play to load vars for managed_node2 30583 1726853735.84670: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853735.84680: Calling groups_plugins_play to load vars for managed_node2 30583 1726853735.85578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853735.86441: done with get_vars() 30583 1726853735.86456: done getting variables 30583 1726853735.86502: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:35:35 -0400 (0:00:00.038) 0:01:11.202 ****** 30583 1726853735.86528: entering _queue_task() for managed_node2/debug 30583 1726853735.86775: worker is 1 (out of 1 available) 30583 1726853735.86789: exiting _queue_task() for managed_node2/debug 30583 1726853735.86802: done queuing things up, now waiting for results queue to drain 30583 1726853735.86803: waiting for pending results... 
30583 1726853735.86991: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30583 1726853735.87084: in run() - task 02083763-bbaf-05ea-abc5-000000001473 30583 1726853735.87094: variable 'ansible_search_path' from source: unknown 30583 1726853735.87096: variable 'ansible_search_path' from source: unknown 30583 1726853735.87126: calling self._execute() 30583 1726853735.87206: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853735.87210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853735.87218: variable 'omit' from source: magic vars 30583 1726853735.87500: variable 'ansible_distribution_major_version' from source: facts 30583 1726853735.87510: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853735.87592: variable 'network_state' from source: role '' defaults 30583 1726853735.87602: Evaluated conditional (network_state != {}): False 30583 1726853735.87605: when evaluation is False, skipping this task 30583 1726853735.87607: _execute() done 30583 1726853735.87610: dumping result to json 30583 1726853735.87612: done dumping result, returning 30583 1726853735.87620: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-05ea-abc5-000000001473] 30583 1726853735.87624: sending task result for task 02083763-bbaf-05ea-abc5-000000001473 30583 1726853735.87710: done sending task result for task 02083763-bbaf-05ea-abc5-000000001473 30583 1726853735.87714: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 30583 1726853735.87761: no more pending results, returning what we have 30583 1726853735.87765: results queue empty 30583 1726853735.87766: checking for any_errors_fatal 30583 1726853735.87777: done checking for any_errors_fatal 30583 1726853735.87778: checking for 
max_fail_percentage 30583 1726853735.87780: done checking for max_fail_percentage 30583 1726853735.87781: checking to see if all hosts have failed and the running result is not ok 30583 1726853735.87781: done checking to see if all hosts have failed 30583 1726853735.87782: getting the remaining hosts for this loop 30583 1726853735.87784: done getting the remaining hosts for this loop 30583 1726853735.87788: getting the next task for host managed_node2 30583 1726853735.87794: done getting next task for host managed_node2 30583 1726853735.87798: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30583 1726853735.87803: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853735.87825: getting variables 30583 1726853735.87827: in VariableManager get_vars() 30583 1726853735.87863: Calling all_inventory to load vars for managed_node2 30583 1726853735.87865: Calling groups_inventory to load vars for managed_node2 30583 1726853735.87868: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853735.87882: Calling all_plugins_play to load vars for managed_node2 30583 1726853735.87885: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853735.87888: Calling groups_plugins_play to load vars for managed_node2 30583 1726853735.88667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853735.89552: done with get_vars() 30583 1726853735.89570: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:35:35 -0400 (0:00:00.031) 0:01:11.233 ****** 30583 1726853735.89645: entering _queue_task() for managed_node2/ping 30583 1726853735.89902: worker is 1 (out of 1 available) 30583 1726853735.89916: exiting _queue_task() for managed_node2/ping 30583 1726853735.89929: done queuing things up, now waiting for results queue to drain 30583 1726853735.89930: waiting for pending results... 
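The skip recorded above follows from the role default `network_state: {}`: the task's `when: network_state != {}` conditional renders False, so the action never executes and the result carries the failing condition back to the callback. A minimal sketch of that gating logic (the function and its names are illustrative, not Ansible internals — Ansible templates conditionals through Jinja2, for which a restricted `eval` stands in here):

```python
# Illustrative only: mirrors how a `when:` conditional gates a task.
def run_task(task_vars, condition):
    # Ansible renders the condition with the task's variables via
    # Jinja2; a plain eval over the vars namespace stands in for that.
    if not eval(condition, {}, task_vars):
        # Skipped tasks report which condition evaluated False.
        return {"skipped": True, "false_condition": condition}
    return {"changed": False}

# Role default: network_state is an empty dict, so the task skips.
print(run_task({"network_state": {}}, "network_state != {}"))
# {'skipped': True, 'false_condition': 'network_state != {}'}
```

The `Evaluated conditional (ansible_distribution_major_version != '6'): True` line earlier in the same record is the role's EL6 guard passing the same mechanism before the `network_state` check fails.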
30583 1726853735.90122: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 30583 1726853735.90197: in run() - task 02083763-bbaf-05ea-abc5-000000001474 30583 1726853735.90208: variable 'ansible_search_path' from source: unknown 30583 1726853735.90211: variable 'ansible_search_path' from source: unknown 30583 1726853735.90240: calling self._execute() 30583 1726853735.90321: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853735.90325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853735.90333: variable 'omit' from source: magic vars 30583 1726853735.90622: variable 'ansible_distribution_major_version' from source: facts 30583 1726853735.90632: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853735.90638: variable 'omit' from source: magic vars 30583 1726853735.90687: variable 'omit' from source: magic vars 30583 1726853735.90713: variable 'omit' from source: magic vars 30583 1726853735.90745: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853735.90774: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853735.90790: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853735.90803: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853735.90816: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853735.90838: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853735.90841: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853735.90846: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30583 1726853735.90918: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853735.90922: Set connection var ansible_timeout to 10 30583 1726853735.90924: Set connection var ansible_connection to ssh 30583 1726853735.90934: Set connection var ansible_shell_executable to /bin/sh 30583 1726853735.90936: Set connection var ansible_shell_type to sh 30583 1726853735.90941: Set connection var ansible_pipelining to False 30583 1726853735.90963: variable 'ansible_shell_executable' from source: unknown 30583 1726853735.90966: variable 'ansible_connection' from source: unknown 30583 1726853735.90969: variable 'ansible_module_compression' from source: unknown 30583 1726853735.90973: variable 'ansible_shell_type' from source: unknown 30583 1726853735.90975: variable 'ansible_shell_executable' from source: unknown 30583 1726853735.90977: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853735.90980: variable 'ansible_pipelining' from source: unknown 30583 1726853735.90982: variable 'ansible_timeout' from source: unknown 30583 1726853735.90984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853735.91134: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853735.91145: variable 'omit' from source: magic vars 30583 1726853735.91148: starting attempt loop 30583 1726853735.91150: running the handler 30583 1726853735.91166: _low_level_execute_command(): starting 30583 1726853735.91175: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853735.91677: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 
1726853735.91681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853735.91693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853735.91731: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853735.91745: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853735.91834: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853735.93574: stdout chunk (state=3): >>>/root <<< 30583 1726853735.93784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853735.93787: stdout chunk (state=3): >>><<< 30583 1726853735.93790: stderr chunk (state=3): >>><<< 30583 1726853735.93793: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853735.93796: _low_level_execute_command(): starting 30583 1726853735.93799: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853735.9373944-33938-212259484897774 `" && echo ansible-tmp-1726853735.9373944-33938-212259484897774="` echo /root/.ansible/tmp/ansible-tmp-1726853735.9373944-33938-212259484897774 `" ) && sleep 0' 30583 1726853735.94403: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853735.94418: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853735.94433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853735.94452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853735.94476: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853735.94490: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853735.94589: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853735.94685: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853735.94765: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853735.94793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853735.94903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853735.96925: stdout chunk (state=3): >>>ansible-tmp-1726853735.9373944-33938-212259484897774=/root/.ansible/tmp/ansible-tmp-1726853735.9373944-33938-212259484897774 <<< 30583 1726853735.97053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853735.97056: stdout chunk (state=3): >>><<< 30583 1726853735.97067: stderr chunk (state=3): >>><<< 30583 1726853735.97085: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853735.9373944-33938-212259484897774=/root/.ansible/tmp/ansible-tmp-1726853735.9373944-33938-212259484897774 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853735.97127: variable 'ansible_module_compression' from source: unknown 30583 1726853735.97160: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30583 1726853735.97196: variable 'ansible_facts' from source: unknown 30583 1726853735.97249: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853735.9373944-33938-212259484897774/AnsiballZ_ping.py 30583 1726853735.97350: Sending initial data 30583 1726853735.97354: Sent initial data (153 bytes) 30583 1726853735.97803: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853735.97806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853735.97808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853735.97811: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853735.97813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853735.97866: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853735.97875: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853735.97941: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853735.99587: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30583 1726853735.99591: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853735.99651: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853735.99720: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpollsxqoy /root/.ansible/tmp/ansible-tmp-1726853735.9373944-33938-212259484897774/AnsiballZ_ping.py <<< 30583 1726853735.99725: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853735.9373944-33938-212259484897774/AnsiballZ_ping.py" <<< 30583 1726853735.99792: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpollsxqoy" to remote "/root/.ansible/tmp/ansible-tmp-1726853735.9373944-33938-212259484897774/AnsiballZ_ping.py" <<< 30583 1726853735.99796: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853735.9373944-33938-212259484897774/AnsiballZ_ping.py" <<< 30583 1726853736.00445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853736.00489: stderr chunk (state=3): >>><<< 30583 1726853736.00493: stdout chunk (state=3): >>><<< 30583 1726853736.00517: done transferring module to remote 30583 1726853736.00526: _low_level_execute_command(): starting 30583 1726853736.00531: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853735.9373944-33938-212259484897774/ /root/.ansible/tmp/ansible-tmp-1726853735.9373944-33938-212259484897774/AnsiballZ_ping.py && sleep 0' 30583 1726853736.00969: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853736.00975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853736.00977: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853736.00979: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853736.00985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853736.00987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853736.01036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853736.01042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853736.01044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853736.01111: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853736.02996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853736.03022: stderr chunk (state=3): >>><<< 30583 1726853736.03025: stdout chunk (state=3): >>><<< 30583 1726853736.03043: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853736.03046: _low_level_execute_command(): starting 30583 1726853736.03051: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853735.9373944-33938-212259484897774/AnsiballZ_ping.py && sleep 0' 30583 1726853736.03507: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853736.03511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853736.03513: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853736.03515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
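The `AnsiballZ_ping.py` payload being executed over SSH here wraps `ansible.modules.ping`: the module echoes its `data` parameter back as the `"ping"` value (defaulting to `"pong"`), which is why the stdout chunk below is `{"ping": "pong", ...}`. A sketch of that contract, not the module's real source:

```python
import json

# Sketch of the ping module's behaviour: return the `data` argument
# as the "ping" value; the real module raises when data == "crash"
# (a deliberate failure mode for testing).
def ping(module_args):
    data = module_args.get("data", "pong")
    if data == "crash":
        raise Exception("boom")
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

print(json.dumps(ping({})))
```

With no `data` argument, as in this run, the JSON printed on stdout matches the module result in the log.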
30583 1726853736.03575: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853736.03581: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853736.03584: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853736.03650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853736.19299: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30583 1726853736.20766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853736.20770: stdout chunk (state=3): >>><<< 30583 1726853736.20774: stderr chunk (state=3): >>><<< 30583 1726853736.20902: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853736.20907: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853735.9373944-33938-212259484897774/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853736.20909: _low_level_execute_command(): starting 30583 1726853736.20912: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853735.9373944-33938-212259484897774/ > /dev/null 2>&1 && sleep 0' 30583 1726853736.21453: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853736.21468: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853736.21525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853736.21591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853736.21608: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853736.21646: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853736.21750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853736.23677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853736.23733: stderr chunk (state=3): >>><<< 30583 1726853736.23740: stdout chunk (state=3): >>><<< 30583 1726853736.23976: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853736.23980: handler run complete 30583 1726853736.23982: attempt loop complete, returning result 30583 1726853736.23985: _execute() done 30583 1726853736.23987: dumping result to json 30583 1726853736.23989: done dumping result, returning 30583 1726853736.23991: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-05ea-abc5-000000001474] 30583 1726853736.23993: sending task result for task 02083763-bbaf-05ea-abc5-000000001474 30583 1726853736.24061: done sending task result for task 02083763-bbaf-05ea-abc5-000000001474 30583 1726853736.24064: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 30583 1726853736.24141: no more pending results, returning what we have 30583 1726853736.24145: results queue empty 30583 1726853736.24146: checking for any_errors_fatal 30583 1726853736.24153: done checking for any_errors_fatal 30583 1726853736.24154: checking for max_fail_percentage 30583 1726853736.24156: done checking for max_fail_percentage 30583 1726853736.24157: checking to see if all hosts have failed and the running result is not ok 30583 1726853736.24161: done checking to see if all hosts have failed 30583 1726853736.24161: getting the remaining hosts for this loop 30583 1726853736.24163: done getting the remaining hosts for this loop 30583 1726853736.24167: getting the next task for host managed_node2 30583 1726853736.24181: done getting next task for host managed_node2 30583 1726853736.24183: ^ task is: TASK: meta (role_complete) 30583 1726853736.24189: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853736.24201: getting variables 30583 1726853736.24203: in VariableManager get_vars() 30583 1726853736.24250: Calling all_inventory to load vars for managed_node2 30583 1726853736.24253: Calling groups_inventory to load vars for managed_node2 30583 1726853736.24256: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853736.24269: Calling all_plugins_play to load vars for managed_node2 30583 1726853736.24387: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853736.24392: Calling groups_plugins_play to load vars for managed_node2 30583 1726853736.26040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853736.27707: done with get_vars() 30583 1726853736.27735: done getting variables 30583 1726853736.27839: done queuing things up, now waiting for results queue to drain 30583 1726853736.27841: results queue empty 30583 1726853736.27842: checking for any_errors_fatal 30583 1726853736.27845: done checking for 
any_errors_fatal 30583 1726853736.27846: checking for max_fail_percentage 30583 1726853736.27847: done checking for max_fail_percentage 30583 1726853736.27847: checking to see if all hosts have failed and the running result is not ok 30583 1726853736.27853: done checking to see if all hosts have failed 30583 1726853736.27853: getting the remaining hosts for this loop 30583 1726853736.27854: done getting the remaining hosts for this loop 30583 1726853736.27860: getting the next task for host managed_node2 30583 1726853736.27866: done getting next task for host managed_node2 30583 1726853736.27869: ^ task is: TASK: Asserts 30583 1726853736.27872: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853736.27876: getting variables 30583 1726853736.27877: in VariableManager get_vars() 30583 1726853736.27889: Calling all_inventory to load vars for managed_node2 30583 1726853736.27891: Calling groups_inventory to load vars for managed_node2 30583 1726853736.27893: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853736.27898: Calling all_plugins_play to load vars for managed_node2 30583 1726853736.27900: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853736.27903: Calling groups_plugins_play to load vars for managed_node2 30583 1726853736.29078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853736.30883: done with get_vars() 30583 1726853736.30905: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 13:35:36 -0400 (0:00:00.413) 0:01:11.647 ****** 30583 1726853736.30985: entering _queue_task() for managed_node2/include_tasks 30583 1726853736.31432: worker is 1 (out of 1 available) 30583 1726853736.31446: exiting _queue_task() for managed_node2/include_tasks 30583 1726853736.31462: done queuing things up, now waiting for results queue to drain 30583 1726853736.31463: waiting for pending results... 
30583 1726853736.31777: running TaskExecutor() for managed_node2/TASK: Asserts 30583 1726853736.31846: in run() - task 02083763-bbaf-05ea-abc5-00000000100a 30583 1726853736.31878: variable 'ansible_search_path' from source: unknown 30583 1726853736.31886: variable 'ansible_search_path' from source: unknown 30583 1726853736.31934: variable 'lsr_assert' from source: include params 30583 1726853736.32177: variable 'lsr_assert' from source: include params 30583 1726853736.32253: variable 'omit' from source: magic vars 30583 1726853736.32476: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853736.32480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853736.32483: variable 'omit' from source: magic vars 30583 1726853736.32703: variable 'ansible_distribution_major_version' from source: facts 30583 1726853736.32719: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853736.32730: variable 'item' from source: unknown 30583 1726853736.32805: variable 'item' from source: unknown 30583 1726853736.32839: variable 'item' from source: unknown 30583 1726853736.32912: variable 'item' from source: unknown 30583 1726853736.33288: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853736.33292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853736.33295: variable 'omit' from source: magic vars 30583 1726853736.33297: variable 'ansible_distribution_major_version' from source: facts 30583 1726853736.33301: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853736.33310: variable 'item' from source: unknown 30583 1726853736.33376: variable 'item' from source: unknown 30583 1726853736.33415: variable 'item' from source: unknown 30583 1726853736.33481: variable 'item' from source: unknown 30583 1726853736.33566: dumping result to json 30583 1726853736.33577: done dumping result, returning 30583 
1726853736.33613: done running TaskExecutor() for managed_node2/TASK: Asserts [02083763-bbaf-05ea-abc5-00000000100a] 30583 1726853736.33617: sending task result for task 02083763-bbaf-05ea-abc5-00000000100a 30583 1726853736.33741: no more pending results, returning what we have 30583 1726853736.33746: in VariableManager get_vars() 30583 1726853736.33794: Calling all_inventory to load vars for managed_node2 30583 1726853736.33797: Calling groups_inventory to load vars for managed_node2 30583 1726853736.33801: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853736.33814: Calling all_plugins_play to load vars for managed_node2 30583 1726853736.33818: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853736.33820: Calling groups_plugins_play to load vars for managed_node2 30583 1726853736.34485: done sending task result for task 02083763-bbaf-05ea-abc5-00000000100a 30583 1726853736.34488: WORKER PROCESS EXITING 30583 1726853736.35456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853736.37100: done with get_vars() 30583 1726853736.37122: variable 'ansible_search_path' from source: unknown 30583 1726853736.37124: variable 'ansible_search_path' from source: unknown 30583 1726853736.37174: variable 'ansible_search_path' from source: unknown 30583 1726853736.37175: variable 'ansible_search_path' from source: unknown 30583 1726853736.37205: we have included files to process 30583 1726853736.37206: generating all_blocks data 30583 1726853736.37208: done generating all_blocks data 30583 1726853736.37213: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30583 1726853736.37214: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30583 1726853736.37217: Loading data from 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30583 1726853736.37336: in VariableManager get_vars() 30583 1726853736.37365: done with get_vars() 30583 1726853736.37488: done processing included file 30583 1726853736.37490: iterating over new_blocks loaded from include file 30583 1726853736.37492: in VariableManager get_vars() 30583 1726853736.37507: done with get_vars() 30583 1726853736.37508: filtering new block on tags 30583 1726853736.37545: done filtering new block on tags 30583 1726853736.37548: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node2 => (item=tasks/assert_device_present.yml) 30583 1726853736.37553: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30583 1726853736.37554: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30583 1726853736.37557: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30583 1726853736.37702: in VariableManager get_vars() 30583 1726853736.37721: done with get_vars() 30583 1726853736.37822: done processing included file 30583 1726853736.37824: iterating over new_blocks loaded from include file 30583 1726853736.37825: in VariableManager get_vars() 30583 1726853736.37840: done with get_vars() 30583 1726853736.37842: filtering new block on tags 30583 1726853736.37879: done filtering new block on tags 30583 1726853736.37881: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for 
managed_node2 => (item=tasks/assert_profile_absent.yml) 30583 1726853736.37884: extending task lists for all hosts with included blocks 30583 1726853736.38886: done extending task lists 30583 1726853736.38887: done processing included files 30583 1726853736.38888: results queue empty 30583 1726853736.38889: checking for any_errors_fatal 30583 1726853736.38891: done checking for any_errors_fatal 30583 1726853736.38892: checking for max_fail_percentage 30583 1726853736.38893: done checking for max_fail_percentage 30583 1726853736.38894: checking to see if all hosts have failed and the running result is not ok 30583 1726853736.38894: done checking to see if all hosts have failed 30583 1726853736.38895: getting the remaining hosts for this loop 30583 1726853736.38896: done getting the remaining hosts for this loop 30583 1726853736.38899: getting the next task for host managed_node2 30583 1726853736.38903: done getting next task for host managed_node2 30583 1726853736.38905: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30583 1726853736.38908: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853736.38916: getting variables 30583 1726853736.38917: in VariableManager get_vars() 30583 1726853736.38927: Calling all_inventory to load vars for managed_node2 30583 1726853736.38929: Calling groups_inventory to load vars for managed_node2 30583 1726853736.38931: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853736.38937: Calling all_plugins_play to load vars for managed_node2 30583 1726853736.38940: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853736.38942: Calling groups_plugins_play to load vars for managed_node2 30583 1726853736.40215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853736.41821: done with get_vars() 30583 1726853736.41845: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:35:36 -0400 (0:00:00.109) 0:01:11.756 ****** 30583 1726853736.41936: entering _queue_task() for managed_node2/include_tasks 30583 1726853736.42401: worker is 1 (out of 1 available) 30583 1726853736.42413: exiting _queue_task() for managed_node2/include_tasks 30583 1726853736.42539: done queuing things up, now waiting for results queue to drain 30583 1726853736.42541: waiting for pending results... 
30583 1726853736.42701: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 30583 1726853736.42838: in run() - task 02083763-bbaf-05ea-abc5-0000000015cf 30583 1726853736.42865: variable 'ansible_search_path' from source: unknown 30583 1726853736.42879: variable 'ansible_search_path' from source: unknown 30583 1726853736.42921: calling self._execute() 30583 1726853736.43027: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853736.43039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853736.43056: variable 'omit' from source: magic vars 30583 1726853736.43492: variable 'ansible_distribution_major_version' from source: facts 30583 1726853736.43514: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853736.43529: _execute() done 30583 1726853736.43537: dumping result to json 30583 1726853736.43545: done dumping result, returning 30583 1726853736.43555: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-05ea-abc5-0000000015cf] 30583 1726853736.43568: sending task result for task 02083763-bbaf-05ea-abc5-0000000015cf 30583 1726853736.43757: no more pending results, returning what we have 30583 1726853736.43767: in VariableManager get_vars() 30583 1726853736.43811: Calling all_inventory to load vars for managed_node2 30583 1726853736.43815: Calling groups_inventory to load vars for managed_node2 30583 1726853736.43819: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853736.43836: Calling all_plugins_play to load vars for managed_node2 30583 1726853736.43840: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853736.43844: Calling groups_plugins_play to load vars for managed_node2 30583 1726853736.44505: done sending task result for task 02083763-bbaf-05ea-abc5-0000000015cf 30583 1726853736.44513: WORKER PROCESS EXITING 30583 
1726853736.45479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853736.47068: done with get_vars() 30583 1726853736.47087: variable 'ansible_search_path' from source: unknown 30583 1726853736.47089: variable 'ansible_search_path' from source: unknown 30583 1726853736.47097: variable 'item' from source: include params 30583 1726853736.47208: variable 'item' from source: include params 30583 1726853736.47247: we have included files to process 30583 1726853736.47248: generating all_blocks data 30583 1726853736.47250: done generating all_blocks data 30583 1726853736.47251: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30583 1726853736.47252: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30583 1726853736.47255: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30583 1726853736.47451: done processing included file 30583 1726853736.47454: iterating over new_blocks loaded from include file 30583 1726853736.47455: in VariableManager get_vars() 30583 1726853736.47476: done with get_vars() 30583 1726853736.47478: filtering new block on tags 30583 1726853736.47505: done filtering new block on tags 30583 1726853736.47508: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 30583 1726853736.47513: extending task lists for all hosts with included blocks 30583 1726853736.47694: done extending task lists 30583 1726853736.47696: done processing included files 30583 1726853736.47697: results queue empty 30583 1726853736.47697: checking for any_errors_fatal 30583 1726853736.47702: done 
checking for any_errors_fatal 30583 1726853736.47702: checking for max_fail_percentage 30583 1726853736.47704: done checking for max_fail_percentage 30583 1726853736.47705: checking to see if all hosts have failed and the running result is not ok 30583 1726853736.47705: done checking to see if all hosts have failed 30583 1726853736.47706: getting the remaining hosts for this loop 30583 1726853736.47707: done getting the remaining hosts for this loop 30583 1726853736.47710: getting the next task for host managed_node2 30583 1726853736.47714: done getting next task for host managed_node2 30583 1726853736.47717: ^ task is: TASK: Get stat for interface {{ interface }} 30583 1726853736.47720: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853736.47723: getting variables 30583 1726853736.47724: in VariableManager get_vars() 30583 1726853736.47733: Calling all_inventory to load vars for managed_node2 30583 1726853736.47735: Calling groups_inventory to load vars for managed_node2 30583 1726853736.47738: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853736.47743: Calling all_plugins_play to load vars for managed_node2 30583 1726853736.47746: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853736.47749: Calling groups_plugins_play to load vars for managed_node2 30583 1726853736.53908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853736.55599: done with get_vars() 30583 1726853736.55624: done getting variables 30583 1726853736.55750: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:35:36 -0400 (0:00:00.138) 0:01:11.895 ****** 30583 1726853736.55782: entering _queue_task() for managed_node2/stat 30583 1726853736.56134: worker is 1 (out of 1 available) 30583 1726853736.56148: exiting _queue_task() for managed_node2/stat 30583 1726853736.56161: done queuing things up, now waiting for results queue to drain 30583 1726853736.56162: waiting for pending results... 
30583 1726853736.56448: running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr 30583 1726853736.56611: in run() - task 02083763-bbaf-05ea-abc5-000000001647 30583 1726853736.56634: variable 'ansible_search_path' from source: unknown 30583 1726853736.56644: variable 'ansible_search_path' from source: unknown 30583 1726853736.56691: calling self._execute() 30583 1726853736.56799: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853736.56811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853736.56824: variable 'omit' from source: magic vars 30583 1726853736.57210: variable 'ansible_distribution_major_version' from source: facts 30583 1726853736.57233: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853736.57246: variable 'omit' from source: magic vars 30583 1726853736.57305: variable 'omit' from source: magic vars 30583 1726853736.57439: variable 'interface' from source: play vars 30583 1726853736.57443: variable 'omit' from source: magic vars 30583 1726853736.57482: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853736.57522: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853736.57551: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853736.57579: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853736.57656: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853736.57662: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853736.57665: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853736.57668: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853736.57747: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853736.57763: Set connection var ansible_timeout to 10 30583 1726853736.57774: Set connection var ansible_connection to ssh 30583 1726853736.57976: Set connection var ansible_shell_executable to /bin/sh 30583 1726853736.57980: Set connection var ansible_shell_type to sh 30583 1726853736.57982: Set connection var ansible_pipelining to False 30583 1726853736.57984: variable 'ansible_shell_executable' from source: unknown 30583 1726853736.57987: variable 'ansible_connection' from source: unknown 30583 1726853736.57989: variable 'ansible_module_compression' from source: unknown 30583 1726853736.57991: variable 'ansible_shell_type' from source: unknown 30583 1726853736.57993: variable 'ansible_shell_executable' from source: unknown 30583 1726853736.57995: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853736.57997: variable 'ansible_pipelining' from source: unknown 30583 1726853736.58000: variable 'ansible_timeout' from source: unknown 30583 1726853736.58003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853736.58069: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853736.58087: variable 'omit' from source: magic vars 30583 1726853736.58097: starting attempt loop 30583 1726853736.58103: running the handler 30583 1726853736.58126: _low_level_execute_command(): starting 30583 1726853736.58139: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853736.58865: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853736.58995: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853736.59016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853736.59042: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853736.59141: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853736.60894: stdout chunk (state=3): >>>/root <<< 30583 1726853736.61038: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853736.61052: stdout chunk (state=3): >>><<< 30583 1726853736.61067: stderr chunk (state=3): >>><<< 30583 1726853736.61092: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853736.61108: _low_level_execute_command(): starting 30583 1726853736.61180: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853736.6109762-33961-201590194194639 `" && echo ansible-tmp-1726853736.6109762-33961-201590194194639="` echo /root/.ansible/tmp/ansible-tmp-1726853736.6109762-33961-201590194194639 `" ) && sleep 0' 30583 1726853736.61816: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853736.61839: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853736.61869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853736.61953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853736.62007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853736.62025: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853736.62053: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853736.62169: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853736.64138: stdout chunk (state=3): >>>ansible-tmp-1726853736.6109762-33961-201590194194639=/root/.ansible/tmp/ansible-tmp-1726853736.6109762-33961-201590194194639 <<< 30583 1726853736.64279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853736.64303: stderr chunk (state=3): >>><<< 30583 1726853736.64312: stdout chunk (state=3): >>><<< 30583 1726853736.64340: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853736.6109762-33961-201590194194639=/root/.ansible/tmp/ansible-tmp-1726853736.6109762-33961-201590194194639 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853736.64476: variable 'ansible_module_compression' from source: unknown 30583 1726853736.64480: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30583 1726853736.64511: variable 'ansible_facts' from source: unknown 30583 1726853736.64616: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853736.6109762-33961-201590194194639/AnsiballZ_stat.py 30583 1726853736.64844: Sending initial data 30583 1726853736.64857: Sent initial data (153 bytes) 30583 1726853736.65406: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853736.65418: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853736.65488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853736.65533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853736.65548: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853736.65568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853736.65666: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853736.67374: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853736.67432: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853736.67512: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpceh3_fpw /root/.ansible/tmp/ansible-tmp-1726853736.6109762-33961-201590194194639/AnsiballZ_stat.py <<< 30583 1726853736.67516: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853736.6109762-33961-201590194194639/AnsiballZ_stat.py" <<< 30583 1726853736.67600: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpceh3_fpw" to remote "/root/.ansible/tmp/ansible-tmp-1726853736.6109762-33961-201590194194639/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853736.6109762-33961-201590194194639/AnsiballZ_stat.py" <<< 30583 1726853736.68435: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853736.68558: stderr chunk (state=3): >>><<< 30583 1726853736.68562: stdout chunk (state=3): >>><<< 30583 1726853736.68574: done transferring module to remote 30583 1726853736.68590: _low_level_execute_command(): starting 30583 1726853736.68599: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853736.6109762-33961-201590194194639/ /root/.ansible/tmp/ansible-tmp-1726853736.6109762-33961-201590194194639/AnsiballZ_stat.py && sleep 0' 30583 1726853736.69277: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853736.69292: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853736.69337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 
1726853736.69356: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853736.69448: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853736.69494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853736.69568: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853736.71578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853736.71582: stdout chunk (state=3): >>><<< 30583 1726853736.71585: stderr chunk (state=3): >>><<< 30583 1726853736.71691: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853736.71695: _low_level_execute_command(): starting 30583 1726853736.71698: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853736.6109762-33961-201590194194639/AnsiballZ_stat.py && sleep 0' 30583 1726853736.72369: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853736.72386: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853736.72400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853736.72425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853736.72546: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853736.72605: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853736.72700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853736.88329: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 32151, "dev": 23, "nlink": 1, "atime": 1726853722.328429, "mtime": 1726853722.328429, "ctime": 1726853722.328429, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30583 1726853736.89741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853736.89768: stderr chunk (state=3): >>><<< 30583 1726853736.89773: stdout chunk (state=3): >>><<< 30583 1726853736.89789: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 32151, "dev": 23, "nlink": 1, "atime": 1726853722.328429, "mtime": 1726853722.328429, "ctime": 1726853722.328429, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853736.89829: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853736.6109762-33961-201590194194639/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853736.89839: _low_level_execute_command(): starting 30583 1726853736.89844: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853736.6109762-33961-201590194194639/ > /dev/null 2>&1 && sleep 0' 30583 1726853736.90276: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853736.90308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853736.90316: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853736.90318: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853736.90320: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853736.90323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853736.90325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853736.90379: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853736.90385: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853736.90387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853736.90456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853736.92408: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853736.92431: stderr chunk (state=3): >>><<< 30583 1726853736.92434: stdout chunk (state=3): >>><<< 30583 1726853736.92446: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853736.92451: handler run complete 30583 1726853736.92489: attempt loop complete, returning result 30583 1726853736.92492: _execute() done 30583 1726853736.92494: dumping result to json 30583 1726853736.92496: done dumping result, returning 30583 1726853736.92503: done running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr [02083763-bbaf-05ea-abc5-000000001647] 30583 1726853736.92507: sending task result for task 02083763-bbaf-05ea-abc5-000000001647 30583 1726853736.92609: done sending task result for task 02083763-bbaf-05ea-abc5-000000001647 30583 1726853736.92612: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726853722.328429, "block_size": 4096, "blocks": 0, "ctime": 1726853722.328429, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 32151, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "mode": "0777", "mtime": 1726853722.328429, "nlink": 1, "path": "/sys/class/net/statebr", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, 
"xgrp": true, "xoth": true, "xusr": true } } 30583 1726853736.92701: no more pending results, returning what we have 30583 1726853736.92705: results queue empty 30583 1726853736.92706: checking for any_errors_fatal 30583 1726853736.92707: done checking for any_errors_fatal 30583 1726853736.92708: checking for max_fail_percentage 30583 1726853736.92710: done checking for max_fail_percentage 30583 1726853736.92711: checking to see if all hosts have failed and the running result is not ok 30583 1726853736.92711: done checking to see if all hosts have failed 30583 1726853736.92712: getting the remaining hosts for this loop 30583 1726853736.92714: done getting the remaining hosts for this loop 30583 1726853736.92717: getting the next task for host managed_node2 30583 1726853736.92728: done getting next task for host managed_node2 30583 1726853736.92731: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 30583 1726853736.92734: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853736.92740: getting variables 30583 1726853736.92741: in VariableManager get_vars() 30583 1726853736.92783: Calling all_inventory to load vars for managed_node2 30583 1726853736.92786: Calling groups_inventory to load vars for managed_node2 30583 1726853736.92789: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853736.92799: Calling all_plugins_play to load vars for managed_node2 30583 1726853736.92802: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853736.92804: Calling groups_plugins_play to load vars for managed_node2 30583 1726853736.93702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853736.94568: done with get_vars() 30583 1726853736.94585: done getting variables 30583 1726853736.94631: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853736.94717: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'statebr'] ************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:35:36 -0400 (0:00:00.389) 0:01:12.284 ****** 30583 1726853736.94743: entering _queue_task() for managed_node2/assert 30583 1726853736.94990: worker is 1 (out of 1 available) 30583 1726853736.95004: exiting _queue_task() for managed_node2/assert 30583 1726853736.95017: done queuing things up, now waiting for results queue to drain 30583 1726853736.95019: waiting for pending results... 
30583 1726853736.95214: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'statebr' 30583 1726853736.95307: in run() - task 02083763-bbaf-05ea-abc5-0000000015d0 30583 1726853736.95318: variable 'ansible_search_path' from source: unknown 30583 1726853736.95322: variable 'ansible_search_path' from source: unknown 30583 1726853736.95350: calling self._execute() 30583 1726853736.95424: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853736.95429: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853736.95438: variable 'omit' from source: magic vars 30583 1726853736.95936: variable 'ansible_distribution_major_version' from source: facts 30583 1726853736.95940: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853736.95943: variable 'omit' from source: magic vars 30583 1726853736.95946: variable 'omit' from source: magic vars 30583 1726853736.95956: variable 'interface' from source: play vars 30583 1726853736.95980: variable 'omit' from source: magic vars 30583 1726853736.96022: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853736.96057: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853736.96083: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853736.96102: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853736.96114: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853736.96153: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853736.96157: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853736.96159: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853736.96259: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853736.96267: Set connection var ansible_timeout to 10 30583 1726853736.96270: Set connection var ansible_connection to ssh 30583 1726853736.96277: Set connection var ansible_shell_executable to /bin/sh 30583 1726853736.96280: Set connection var ansible_shell_type to sh 30583 1726853736.96291: Set connection var ansible_pipelining to False 30583 1726853736.96317: variable 'ansible_shell_executable' from source: unknown 30583 1726853736.96320: variable 'ansible_connection' from source: unknown 30583 1726853736.96323: variable 'ansible_module_compression' from source: unknown 30583 1726853736.96325: variable 'ansible_shell_type' from source: unknown 30583 1726853736.96327: variable 'ansible_shell_executable' from source: unknown 30583 1726853736.96329: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853736.96331: variable 'ansible_pipelining' from source: unknown 30583 1726853736.96336: variable 'ansible_timeout' from source: unknown 30583 1726853736.96338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853736.96489: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853736.96501: variable 'omit' from source: magic vars 30583 1726853736.96506: starting attempt loop 30583 1726853736.96509: running the handler 30583 1726853736.96646: variable 'interface_stat' from source: set_fact 30583 1726853736.96668: Evaluated conditional (interface_stat.stat.exists): True 30583 1726853736.96674: handler run complete 30583 1726853736.96696: attempt loop complete, returning result 30583 
1726853736.96700: _execute() done 30583 1726853736.96702: dumping result to json 30583 1726853736.96704: done dumping result, returning 30583 1726853736.96707: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'statebr' [02083763-bbaf-05ea-abc5-0000000015d0] 30583 1726853736.96709: sending task result for task 02083763-bbaf-05ea-abc5-0000000015d0 30583 1726853736.96796: done sending task result for task 02083763-bbaf-05ea-abc5-0000000015d0 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30583 1726853736.96974: no more pending results, returning what we have 30583 1726853736.96977: results queue empty 30583 1726853736.96978: checking for any_errors_fatal 30583 1726853736.96985: done checking for any_errors_fatal 30583 1726853736.96985: checking for max_fail_percentage 30583 1726853736.96987: done checking for max_fail_percentage 30583 1726853736.96987: checking to see if all hosts have failed and the running result is not ok 30583 1726853736.96988: done checking to see if all hosts have failed 30583 1726853736.96989: getting the remaining hosts for this loop 30583 1726853736.96990: done getting the remaining hosts for this loop 30583 1726853736.96993: getting the next task for host managed_node2 30583 1726853736.97001: done getting next task for host managed_node2 30583 1726853736.97004: ^ task is: TASK: Include the task 'get_profile_stat.yml' 30583 1726853736.97007: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853736.97011: getting variables 30583 1726853736.97012: in VariableManager get_vars() 30583 1726853736.97043: Calling all_inventory to load vars for managed_node2 30583 1726853736.97046: Calling groups_inventory to load vars for managed_node2 30583 1726853736.97048: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853736.97057: Calling all_plugins_play to load vars for managed_node2 30583 1726853736.97059: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853736.97063: Calling groups_plugins_play to load vars for managed_node2 30583 1726853736.97589: WORKER PROCESS EXITING 30583 1726853736.98463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853737.00214: done with get_vars() 30583 1726853737.00240: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 13:35:37 -0400 (0:00:00.055) 0:01:12.340 ****** 30583 1726853737.00313: entering _queue_task() for managed_node2/include_tasks 30583 1726853737.00548: worker is 1 (out of 1 available) 30583 1726853737.00564: exiting _queue_task() for managed_node2/include_tasks 30583 1726853737.00575: done queuing things up, now waiting for results queue to drain 30583 1726853737.00577: waiting for pending results... 
30583 1726853737.00767: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 30583 1726853737.00851: in run() - task 02083763-bbaf-05ea-abc5-0000000015d4 30583 1726853737.00864: variable 'ansible_search_path' from source: unknown 30583 1726853737.00868: variable 'ansible_search_path' from source: unknown 30583 1726853737.00897: calling self._execute() 30583 1726853737.00968: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853737.00973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853737.00982: variable 'omit' from source: magic vars 30583 1726853737.01257: variable 'ansible_distribution_major_version' from source: facts 30583 1726853737.01267: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853737.01274: _execute() done 30583 1726853737.01277: dumping result to json 30583 1726853737.01280: done dumping result, returning 30583 1726853737.01286: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-05ea-abc5-0000000015d4] 30583 1726853737.01291: sending task result for task 02083763-bbaf-05ea-abc5-0000000015d4 30583 1726853737.01385: done sending task result for task 02083763-bbaf-05ea-abc5-0000000015d4 30583 1726853737.01387: WORKER PROCESS EXITING 30583 1726853737.01415: no more pending results, returning what we have 30583 1726853737.01419: in VariableManager get_vars() 30583 1726853737.01463: Calling all_inventory to load vars for managed_node2 30583 1726853737.01465: Calling groups_inventory to load vars for managed_node2 30583 1726853737.01469: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853737.01482: Calling all_plugins_play to load vars for managed_node2 30583 1726853737.01485: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853737.01488: Calling groups_plugins_play to load vars for managed_node2 30583 
1726853737.02920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853737.03772: done with get_vars() 30583 1726853737.03785: variable 'ansible_search_path' from source: unknown 30583 1726853737.03786: variable 'ansible_search_path' from source: unknown 30583 1726853737.03792: variable 'item' from source: include params 30583 1726853737.03868: variable 'item' from source: include params 30583 1726853737.03893: we have included files to process 30583 1726853737.03894: generating all_blocks data 30583 1726853737.03895: done generating all_blocks data 30583 1726853737.03899: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30583 1726853737.03900: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30583 1726853737.03901: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30583 1726853737.04509: done processing included file 30583 1726853737.04511: iterating over new_blocks loaded from include file 30583 1726853737.04512: in VariableManager get_vars() 30583 1726853737.04523: done with get_vars() 30583 1726853737.04524: filtering new block on tags 30583 1726853737.04566: done filtering new block on tags 30583 1726853737.04568: in VariableManager get_vars() 30583 1726853737.04580: done with get_vars() 30583 1726853737.04581: filtering new block on tags 30583 1726853737.04613: done filtering new block on tags 30583 1726853737.04615: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 30583 1726853737.04621: extending task lists for all hosts with included blocks 30583 1726853737.04843: done 
extending task lists 30583 1726853737.04844: done processing included files 30583 1726853737.04845: results queue empty 30583 1726853737.04846: checking for any_errors_fatal 30583 1726853737.04849: done checking for any_errors_fatal 30583 1726853737.04849: checking for max_fail_percentage 30583 1726853737.04850: done checking for max_fail_percentage 30583 1726853737.04851: checking to see if all hosts have failed and the running result is not ok 30583 1726853737.04852: done checking to see if all hosts have failed 30583 1726853737.04853: getting the remaining hosts for this loop 30583 1726853737.04854: done getting the remaining hosts for this loop 30583 1726853737.04856: getting the next task for host managed_node2 30583 1726853737.04864: done getting next task for host managed_node2 30583 1726853737.04867: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 30583 1726853737.04870: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30583 1726853737.04874: getting variables 30583 1726853737.04875: in VariableManager get_vars() 30583 1726853737.04884: Calling all_inventory to load vars for managed_node2 30583 1726853737.04886: Calling groups_inventory to load vars for managed_node2 30583 1726853737.04888: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853737.04894: Calling all_plugins_play to load vars for managed_node2 30583 1726853737.04896: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853737.04898: Calling groups_plugins_play to load vars for managed_node2 30583 1726853737.05944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853737.07592: done with get_vars() 30583 1726853737.07613: done getting variables 30583 1726853737.07654: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:35:37 -0400 (0:00:00.073) 0:01:12.414 ****** 30583 1726853737.07688: entering _queue_task() for managed_node2/set_fact 30583 1726853737.08040: worker is 1 (out of 1 available) 30583 1726853737.08053: exiting _queue_task() for managed_node2/set_fact 30583 1726853737.08068: done queuing things up, now waiting for results queue to drain 30583 1726853737.08070: waiting for pending results... 
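For reference, the two tasks from `get_profile_stat.yml` that this run executes next can be reconstructed approximately from the logged action loads, module arguments, conditionals, and results. This is a hedged sketch, not the actual file contents: the `when` clause and fact values are taken from the "Evaluated conditional" and `set_fact` result entries in this log, the `stat` arguments from the logged `module_args`, and the literal `path` is the rendered value (the source file almost certainly templates it from the `profile`/`interface` play vars noted in the log); any `register` or loop details are omitted because they do not appear in this section.

```yaml
# Sketch of get_profile_stat.yml:3 — initializes the three flags
# seen in the logged set_fact result for managed_node2.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
  when: ansible_distribution_major_version != '6'

# Sketch of get_profile_stat.yml:9 — matches the module_args logged
# for the stat invocation ("Stat profile file"). The path shown is the
# rendered value from this run; the source likely uses a template.
- name: Stat profile file
  stat:
    path: /etc/sysconfig/network-scripts/ifcfg-statebr
    get_attributes: false
    get_checksum: false
    get_mime: false
  when: ansible_distribution_major_version != '6'
```

The logged result (`"stat": {"exists": false}`) is what the later flag-setting logic consumes: the profile file is absent on managed_node2, so `lsr_net_profile_exists` stays `false`.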
30583 1726853737.08401: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 30583 1726853737.08480: in run() - task 02083763-bbaf-05ea-abc5-000000001665 30583 1726853737.08484: variable 'ansible_search_path' from source: unknown 30583 1726853737.08487: variable 'ansible_search_path' from source: unknown 30583 1726853737.08505: calling self._execute() 30583 1726853737.08605: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853737.08610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853737.08613: variable 'omit' from source: magic vars 30583 1726853737.09036: variable 'ansible_distribution_major_version' from source: facts 30583 1726853737.09040: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853737.09042: variable 'omit' from source: magic vars 30583 1726853737.09138: variable 'omit' from source: magic vars 30583 1726853737.09277: variable 'omit' from source: magic vars 30583 1726853737.09281: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853737.09284: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853737.09294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853737.09318: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853737.09336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853737.09379: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853737.09390: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853737.09404: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30583 1726853737.09518: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853737.09530: Set connection var ansible_timeout to 10 30583 1726853737.09537: Set connection var ansible_connection to ssh 30583 1726853737.09548: Set connection var ansible_shell_executable to /bin/sh 30583 1726853737.09555: Set connection var ansible_shell_type to sh 30583 1726853737.09575: Set connection var ansible_pipelining to False 30583 1726853737.09605: variable 'ansible_shell_executable' from source: unknown 30583 1726853737.09621: variable 'ansible_connection' from source: unknown 30583 1726853737.09630: variable 'ansible_module_compression' from source: unknown 30583 1726853737.09637: variable 'ansible_shell_type' from source: unknown 30583 1726853737.09643: variable 'ansible_shell_executable' from source: unknown 30583 1726853737.09649: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853737.09656: variable 'ansible_pipelining' from source: unknown 30583 1726853737.09666: variable 'ansible_timeout' from source: unknown 30583 1726853737.09675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853737.09839: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853737.09947: variable 'omit' from source: magic vars 30583 1726853737.09950: starting attempt loop 30583 1726853737.09953: running the handler 30583 1726853737.09955: handler run complete 30583 1726853737.09957: attempt loop complete, returning result 30583 1726853737.09962: _execute() done 30583 1726853737.09964: dumping result to json 30583 1726853737.09966: done dumping result, returning 30583 1726853737.09968: done running TaskExecutor() for 
managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [02083763-bbaf-05ea-abc5-000000001665] 30583 1726853737.09972: sending task result for task 02083763-bbaf-05ea-abc5-000000001665 30583 1726853737.10042: done sending task result for task 02083763-bbaf-05ea-abc5-000000001665 ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 30583 1726853737.10113: no more pending results, returning what we have 30583 1726853737.10117: results queue empty 30583 1726853737.10119: checking for any_errors_fatal 30583 1726853737.10121: done checking for any_errors_fatal 30583 1726853737.10121: checking for max_fail_percentage 30583 1726853737.10124: done checking for max_fail_percentage 30583 1726853737.10125: checking to see if all hosts have failed and the running result is not ok 30583 1726853737.10126: done checking to see if all hosts have failed 30583 1726853737.10126: getting the remaining hosts for this loop 30583 1726853737.10129: done getting the remaining hosts for this loop 30583 1726853737.10133: getting the next task for host managed_node2 30583 1726853737.10143: done getting next task for host managed_node2 30583 1726853737.10147: ^ task is: TASK: Stat profile file 30583 1726853737.10152: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853737.10157: getting variables 30583 1726853737.10162: in VariableManager get_vars() 30583 1726853737.10202: Calling all_inventory to load vars for managed_node2 30583 1726853737.10205: Calling groups_inventory to load vars for managed_node2 30583 1726853737.10209: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853737.10221: Calling all_plugins_play to load vars for managed_node2 30583 1726853737.10225: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853737.10228: Calling groups_plugins_play to load vars for managed_node2 30583 1726853737.10895: WORKER PROCESS EXITING 30583 1726853737.11947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853737.13585: done with get_vars() 30583 1726853737.13615: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:35:37 -0400 (0:00:00.060) 0:01:12.474 ****** 30583 1726853737.13720: entering _queue_task() for managed_node2/stat 30583 1726853737.14093: worker is 1 (out of 1 available) 30583 1726853737.14108: exiting _queue_task() for managed_node2/stat 30583 1726853737.14120: done queuing things up, now waiting for 
results queue to drain 30583 1726853737.14121: waiting for pending results... 30583 1726853737.14590: running TaskExecutor() for managed_node2/TASK: Stat profile file 30583 1726853737.14594: in run() - task 02083763-bbaf-05ea-abc5-000000001666 30583 1726853737.14597: variable 'ansible_search_path' from source: unknown 30583 1726853737.14600: variable 'ansible_search_path' from source: unknown 30583 1726853737.14610: calling self._execute() 30583 1726853737.14718: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853737.14731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853737.14747: variable 'omit' from source: magic vars 30583 1726853737.15135: variable 'ansible_distribution_major_version' from source: facts 30583 1726853737.15156: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853737.15169: variable 'omit' from source: magic vars 30583 1726853737.15225: variable 'omit' from source: magic vars 30583 1726853737.15340: variable 'profile' from source: play vars 30583 1726853737.15351: variable 'interface' from source: play vars 30583 1726853737.15426: variable 'interface' from source: play vars 30583 1726853737.15455: variable 'omit' from source: magic vars 30583 1726853737.15513: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853737.15561: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853737.15676: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853737.15679: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853737.15682: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853737.15684: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853737.15686: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853737.15694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853737.15794: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853737.15812: Set connection var ansible_timeout to 10 30583 1726853737.15820: Set connection var ansible_connection to ssh 30583 1726853737.15830: Set connection var ansible_shell_executable to /bin/sh 30583 1726853737.15838: Set connection var ansible_shell_type to sh 30583 1726853737.15852: Set connection var ansible_pipelining to False 30583 1726853737.15884: variable 'ansible_shell_executable' from source: unknown 30583 1726853737.15893: variable 'ansible_connection' from source: unknown 30583 1726853737.15900: variable 'ansible_module_compression' from source: unknown 30583 1726853737.15912: variable 'ansible_shell_type' from source: unknown 30583 1726853737.15921: variable 'ansible_shell_executable' from source: unknown 30583 1726853737.15928: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853737.15937: variable 'ansible_pipelining' from source: unknown 30583 1726853737.15945: variable 'ansible_timeout' from source: unknown 30583 1726853737.15952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853737.16242: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853737.16246: variable 'omit' from source: magic vars 30583 1726853737.16248: starting attempt loop 30583 1726853737.16251: running the handler 30583 1726853737.16253: _low_level_execute_command(): starting 30583 1726853737.16255: _low_level_execute_command(): executing: 
/bin/sh -c 'echo ~ && sleep 0' 30583 1726853737.16997: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853737.17096: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853737.17136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853737.17152: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853737.17181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853737.17293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853737.19026: stdout chunk (state=3): >>>/root <<< 30583 1726853737.19188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853737.19191: stdout chunk (state=3): >>><<< 30583 1726853737.19194: stderr chunk (state=3): >>><<< 30583 1726853737.19216: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853737.19311: _low_level_execute_command(): starting 30583 1726853737.19315: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853737.1922212-33989-95098230536635 `" && echo ansible-tmp-1726853737.1922212-33989-95098230536635="` echo /root/.ansible/tmp/ansible-tmp-1726853737.1922212-33989-95098230536635 `" ) && sleep 0' 30583 1726853737.19867: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853737.19992: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853737.20020: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853737.20038: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853737.20142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853737.22161: stdout chunk (state=3): >>>ansible-tmp-1726853737.1922212-33989-95098230536635=/root/.ansible/tmp/ansible-tmp-1726853737.1922212-33989-95098230536635 <<< 30583 1726853737.22306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853737.22318: stdout chunk (state=3): >>><<< 30583 1726853737.22337: stderr chunk (state=3): >>><<< 30583 1726853737.22476: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853737.1922212-33989-95098230536635=/root/.ansible/tmp/ansible-tmp-1726853737.1922212-33989-95098230536635 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853737.22480: variable 'ansible_module_compression' from source: unknown 30583 1726853737.22482: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30583 1726853737.22524: variable 'ansible_facts' from source: unknown 30583 1726853737.22635: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853737.1922212-33989-95098230536635/AnsiballZ_stat.py 30583 1726853737.22837: Sending initial data 30583 1726853737.22840: Sent initial data (152 bytes) 30583 1726853737.23529: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853737.23588: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853737.23649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853737.23668: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853737.23698: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853737.23803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853737.25703: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853737.25776: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853737.25853: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp7ikzpoan /root/.ansible/tmp/ansible-tmp-1726853737.1922212-33989-95098230536635/AnsiballZ_stat.py <<< 30583 1726853737.25856: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853737.1922212-33989-95098230536635/AnsiballZ_stat.py" <<< 30583 1726853737.25917: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp7ikzpoan" to remote "/root/.ansible/tmp/ansible-tmp-1726853737.1922212-33989-95098230536635/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853737.1922212-33989-95098230536635/AnsiballZ_stat.py" <<< 30583 1726853737.27344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853737.27354: stdout chunk (state=3): >>><<< 30583 1726853737.27368: stderr chunk (state=3): >>><<< 30583 1726853737.27394: done transferring module to remote 30583 1726853737.27407: _low_level_execute_command(): starting 30583 1726853737.27416: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853737.1922212-33989-95098230536635/ /root/.ansible/tmp/ansible-tmp-1726853737.1922212-33989-95098230536635/AnsiballZ_stat.py && sleep 0' 30583 1726853737.28106: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853737.28185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853737.28223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853737.28241: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853737.28256: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853737.28362: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853737.30516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853737.30520: stdout chunk (state=3): >>><<< 30583 1726853737.30522: stderr chunk (state=3): >>><<< 30583 1726853737.30525: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853737.30533: _low_level_execute_command(): starting 30583 1726853737.30536: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853737.1922212-33989-95098230536635/AnsiballZ_stat.py && sleep 0' 30583 1726853737.31583: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853737.31587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853737.31589: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853737.31591: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853737.31594: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853737.31645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 
1726853737.31742: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853737.47255: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30583 1726853737.48654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853737.48682: stderr chunk (state=3): >>><<< 30583 1726853737.48685: stdout chunk (state=3): >>><<< 30583 1726853737.48701: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853737.48724: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853737.1922212-33989-95098230536635/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853737.48733: _low_level_execute_command(): starting 30583 1726853737.48738: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853737.1922212-33989-95098230536635/ > /dev/null 2>&1 && sleep 0' 30583 1726853737.49144: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853737.49180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853737.49183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853737.49186: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853737.49188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853737.49190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853737.49243: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853737.49246: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853737.49248: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853737.49312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853737.51204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853737.51224: stderr chunk (state=3): >>><<< 30583 1726853737.51227: stdout chunk (state=3): >>><<< 30583 1726853737.51240: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853737.51247: handler run complete 30583 1726853737.51265: attempt loop complete, returning result 30583 1726853737.51268: _execute() done 30583 1726853737.51270: dumping result to json 30583 1726853737.51275: done dumping result, returning 30583 1726853737.51284: done running TaskExecutor() for managed_node2/TASK: Stat profile file [02083763-bbaf-05ea-abc5-000000001666] 30583 1726853737.51287: sending task result for task 02083763-bbaf-05ea-abc5-000000001666 30583 1726853737.51387: done sending task result for task 02083763-bbaf-05ea-abc5-000000001666 30583 1726853737.51390: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 30583 1726853737.51468: no more pending results, returning what we have 30583 1726853737.51474: results queue empty 30583 1726853737.51475: checking for any_errors_fatal 30583 1726853737.51481: done checking for any_errors_fatal 30583 1726853737.51482: checking for max_fail_percentage 30583 1726853737.51484: done checking for max_fail_percentage 30583 1726853737.51485: checking to see if all hosts have failed and the running result is not ok 30583 1726853737.51485: done checking to see if all hosts have failed 30583 1726853737.51486: getting the remaining hosts for this loop 30583 1726853737.51488: done getting the remaining hosts for this loop 30583 1726853737.51491: getting the next task for host managed_node2 30583 1726853737.51499: done getting next task for host managed_node2 30583 1726853737.51502: ^ task is: TASK: 
Set NM profile exist flag based on the profile files 30583 1726853737.51506: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853737.51510: getting variables 30583 1726853737.51512: in VariableManager get_vars() 30583 1726853737.51548: Calling all_inventory to load vars for managed_node2 30583 1726853737.51550: Calling groups_inventory to load vars for managed_node2 30583 1726853737.51553: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853737.51564: Calling all_plugins_play to load vars for managed_node2 30583 1726853737.51567: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853737.51569: Calling groups_plugins_play to load vars for managed_node2 30583 1726853737.52546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853737.53407: done with get_vars() 30583 1726853737.53425: done getting variables 30583 1726853737.53473: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:35:37 -0400 (0:00:00.397) 0:01:12.872 ****** 30583 1726853737.53496: entering _queue_task() for managed_node2/set_fact 30583 1726853737.53746: worker is 1 (out of 1 available) 30583 1726853737.53764: exiting _queue_task() for managed_node2/set_fact 30583 1726853737.53778: done queuing things up, now waiting for results queue to drain 30583 1726853737.53780: waiting for pending results... 
30583 1726853737.53965: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 30583 1726853737.54047: in run() - task 02083763-bbaf-05ea-abc5-000000001667 30583 1726853737.54057: variable 'ansible_search_path' from source: unknown 30583 1726853737.54063: variable 'ansible_search_path' from source: unknown 30583 1726853737.54093: calling self._execute() 30583 1726853737.54165: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853737.54174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853737.54183: variable 'omit' from source: magic vars 30583 1726853737.54476: variable 'ansible_distribution_major_version' from source: facts 30583 1726853737.54485: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853737.54572: variable 'profile_stat' from source: set_fact 30583 1726853737.54581: Evaluated conditional (profile_stat.stat.exists): False 30583 1726853737.54584: when evaluation is False, skipping this task 30583 1726853737.54586: _execute() done 30583 1726853737.54589: dumping result to json 30583 1726853737.54591: done dumping result, returning 30583 1726853737.54598: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [02083763-bbaf-05ea-abc5-000000001667] 30583 1726853737.54602: sending task result for task 02083763-bbaf-05ea-abc5-000000001667 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30583 1726853737.54734: no more pending results, returning what we have 30583 1726853737.54738: results queue empty 30583 1726853737.54739: checking for any_errors_fatal 30583 1726853737.54749: done checking for any_errors_fatal 30583 1726853737.54750: checking for max_fail_percentage 30583 1726853737.54752: done checking for max_fail_percentage 30583 1726853737.54753: checking to see if all 
hosts have failed and the running result is not ok 30583 1726853737.54753: done checking to see if all hosts have failed 30583 1726853737.54754: getting the remaining hosts for this loop 30583 1726853737.54756: done getting the remaining hosts for this loop 30583 1726853737.54762: getting the next task for host managed_node2 30583 1726853737.54770: done getting next task for host managed_node2 30583 1726853737.54774: ^ task is: TASK: Get NM profile info 30583 1726853737.54779: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853737.54784: getting variables 30583 1726853737.54785: in VariableManager get_vars() 30583 1726853737.54818: Calling all_inventory to load vars for managed_node2 30583 1726853737.54821: Calling groups_inventory to load vars for managed_node2 30583 1726853737.54824: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853737.54834: Calling all_plugins_play to load vars for managed_node2 30583 1726853737.54836: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853737.54839: Calling groups_plugins_play to load vars for managed_node2 30583 1726853737.55384: done sending task result for task 02083763-bbaf-05ea-abc5-000000001667 30583 1726853737.55387: WORKER PROCESS EXITING 30583 1726853737.55654: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853737.56535: done with get_vars() 30583 1726853737.56551: done getting variables 30583 1726853737.56599: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:35:37 -0400 (0:00:00.031) 0:01:12.903 ****** 30583 1726853737.56625: entering _queue_task() for managed_node2/shell 30583 1726853737.56885: worker is 1 (out of 1 available) 30583 1726853737.56900: exiting _queue_task() for managed_node2/shell 30583 1726853737.56912: done queuing things up, now waiting for results queue to drain 30583 1726853737.56914: waiting for pending results... 
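The skipped `set_fact` above reports `"false_condition": "profile_stat.stat.exists"`. The log shows only the task name and the failed condition, so the fact name below is a placeholder; a hedged sketch of the gating pattern:

```yaml
# Hypothetical sketch: only the task name and the when: condition are
# visible in the log; the fact name is an assumed placeholder.
- name: Set NM profile exist flag based on the profile files
  set_fact:
    profile_exists: true   # placeholder name, not taken from the log
  when: profile_stat.stat.exists
```

With `profile_stat.stat.exists` evaluating to `False`, Ansible short-circuits before `_execute()` runs the module, emitting the `skip_reason: "Conditional result was False"` result seen above.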
30583 1726853737.57096: running TaskExecutor() for managed_node2/TASK: Get NM profile info 30583 1726853737.57186: in run() - task 02083763-bbaf-05ea-abc5-000000001668 30583 1726853737.57199: variable 'ansible_search_path' from source: unknown 30583 1726853737.57202: variable 'ansible_search_path' from source: unknown 30583 1726853737.57229: calling self._execute() 30583 1726853737.57304: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853737.57308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853737.57316: variable 'omit' from source: magic vars 30583 1726853737.57598: variable 'ansible_distribution_major_version' from source: facts 30583 1726853737.57608: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853737.57614: variable 'omit' from source: magic vars 30583 1726853737.57648: variable 'omit' from source: magic vars 30583 1726853737.57720: variable 'profile' from source: play vars 30583 1726853737.57723: variable 'interface' from source: play vars 30583 1726853737.57768: variable 'interface' from source: play vars 30583 1726853737.57784: variable 'omit' from source: magic vars 30583 1726853737.57821: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853737.57849: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853737.57865: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853737.57879: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853737.57890: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853737.57917: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 
1726853737.57920: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853737.57922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853737.57990: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853737.57995: Set connection var ansible_timeout to 10 30583 1726853737.57998: Set connection var ansible_connection to ssh 30583 1726853737.58004: Set connection var ansible_shell_executable to /bin/sh 30583 1726853737.58007: Set connection var ansible_shell_type to sh 30583 1726853737.58018: Set connection var ansible_pipelining to False 30583 1726853737.58035: variable 'ansible_shell_executable' from source: unknown 30583 1726853737.58038: variable 'ansible_connection' from source: unknown 30583 1726853737.58040: variable 'ansible_module_compression' from source: unknown 30583 1726853737.58042: variable 'ansible_shell_type' from source: unknown 30583 1726853737.58044: variable 'ansible_shell_executable' from source: unknown 30583 1726853737.58046: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853737.58049: variable 'ansible_pipelining' from source: unknown 30583 1726853737.58052: variable 'ansible_timeout' from source: unknown 30583 1726853737.58056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853737.58160: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853737.58168: variable 'omit' from source: magic vars 30583 1726853737.58174: starting attempt loop 30583 1726853737.58177: running the handler 30583 1726853737.58185: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853737.58201: _low_level_execute_command(): starting 30583 1726853737.58207: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853737.58734: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853737.58737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853737.58741: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853737.58744: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853737.58800: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853737.58804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853737.58813: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853737.58885: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853737.60613: stdout chunk (state=3): >>>/root <<< 30583 1726853737.60716: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853737.60748: stderr chunk (state=3): >>><<< 30583 1726853737.60752: stdout chunk (state=3): >>><<< 30583 1726853737.60775: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853737.60786: _low_level_execute_command(): starting 30583 1726853737.60792: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853737.6077518-34015-28884773282091 `" && echo ansible-tmp-1726853737.6077518-34015-28884773282091="` echo /root/.ansible/tmp/ansible-tmp-1726853737.6077518-34015-28884773282091 `" ) && sleep 0' 30583 1726853737.61476: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853737.61481: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853737.61484: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853737.61486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853737.61551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853737.63543: stdout chunk (state=3): >>>ansible-tmp-1726853737.6077518-34015-28884773282091=/root/.ansible/tmp/ansible-tmp-1726853737.6077518-34015-28884773282091 <<< 30583 1726853737.63649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853737.63678: stderr chunk (state=3): >>><<< 30583 1726853737.63682: stdout chunk (state=3): >>><<< 30583 1726853737.63698: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853737.6077518-34015-28884773282091=/root/.ansible/tmp/ansible-tmp-1726853737.6077518-34015-28884773282091 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853737.63726: variable 'ansible_module_compression' from source: unknown 30583 1726853737.63800: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30583 1726853737.63864: variable 'ansible_facts' from source: unknown 30583 1726853737.63917: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853737.6077518-34015-28884773282091/AnsiballZ_command.py 30583 1726853737.64163: Sending initial data 30583 1726853737.64165: Sent initial data (155 bytes) 30583 1726853737.64773: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853737.64793: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853737.64807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853737.64908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853737.66616: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853737.66707: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853737.66817: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpciloiw45 /root/.ansible/tmp/ansible-tmp-1726853737.6077518-34015-28884773282091/AnsiballZ_command.py <<< 30583 1726853737.66821: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853737.6077518-34015-28884773282091/AnsiballZ_command.py" <<< 30583 1726853737.66921: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpciloiw45" to remote "/root/.ansible/tmp/ansible-tmp-1726853737.6077518-34015-28884773282091/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853737.6077518-34015-28884773282091/AnsiballZ_command.py" <<< 30583 1726853737.67955: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853737.67967: stderr chunk (state=3): >>><<< 30583 1726853737.68103: stdout chunk (state=3): >>><<< 30583 1726853737.68132: done transferring module to remote 30583 1726853737.68178: _low_level_execute_command(): starting 30583 1726853737.68181: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853737.6077518-34015-28884773282091/ /root/.ansible/tmp/ansible-tmp-1726853737.6077518-34015-28884773282091/AnsiballZ_command.py && sleep 0' 30583 1726853737.68752: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853737.68764: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853737.68777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853737.68791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853737.68941: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853737.68957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853737.68960: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853737.69025: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853737.70937: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853737.70990: stderr chunk (state=3): >>><<< 30583 1726853737.71009: stdout chunk (state=3): >>><<< 30583 1726853737.71115: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853737.71119: _low_level_execute_command(): starting 30583 1726853737.71121: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853737.6077518-34015-28884773282091/AnsiballZ_command.py && sleep 0' 30583 1726853737.71666: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853737.71682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853737.71694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853737.71726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853737.71741: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853737.71832: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853737.71874: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853737.71951: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853737.89673: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 13:35:37.878389", "end": "2024-09-20 13:35:37.895585", "delta": "0:00:00.017196", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30583 1726853737.91233: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853737.91260: stderr chunk (state=3): >>><<< 30583 1726853737.91263: stdout chunk (state=3): >>><<< 30583 1726853737.91285: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 13:35:37.878389", "end": "2024-09-20 13:35:37.895585", "delta": "0:00:00.017196", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.197 
closed. 30583 1726853737.91316: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853737.6077518-34015-28884773282091/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853737.91323: _low_level_execute_command(): starting 30583 1726853737.91328: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853737.6077518-34015-28884773282091/ > /dev/null 2>&1 && sleep 0' 30583 1726853737.91764: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853737.91774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853737.91801: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853737.91804: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853737.91811: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853737.91813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853737.91862: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853737.91865: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853737.92002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853737.93909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853737.93922: stdout chunk (state=3): >>><<< 30583 1726853737.93934: stderr chunk (state=3): >>><<< 30583 1726853737.93958: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853737.93973: handler run complete 30583 1726853737.94001: Evaluated conditional (False): False 30583 1726853737.94024: attempt loop complete, returning result 30583 1726853737.94027: _execute() done 30583 1726853737.94029: dumping result to json 30583 1726853737.94077: done dumping result, returning 30583 1726853737.94080: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [02083763-bbaf-05ea-abc5-000000001668] 30583 1726853737.94082: sending task result for task 02083763-bbaf-05ea-abc5-000000001668 fatal: [managed_node2]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.017196", "end": "2024-09-20 13:35:37.895585", "rc": 1, "start": "2024-09-20 13:35:37.878389" } MSG: non-zero return code ...ignoring 30583 1726853737.94352: no more pending results, returning what we have 30583 1726853737.94357: results queue empty 30583 1726853737.94358: checking for any_errors_fatal 30583 1726853737.94365: done checking for any_errors_fatal 30583 1726853737.94366: checking for max_fail_percentage 30583 1726853737.94368: done checking for max_fail_percentage 30583 1726853737.94369: checking to see if all hosts have failed and the running result is not ok 30583 1726853737.94370: done checking to see if all hosts have failed 30583 1726853737.94373: getting the remaining hosts for this loop 30583 1726853737.94375: done getting the remaining hosts for this loop 30583 1726853737.94379: getting the next task for host managed_node2 30583 1726853737.94387: done getting next task for host managed_node2 30583 1726853737.94389: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30583 1726853737.94394: ^ state is: HOST STATE: 
block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853737.94398: getting variables 30583 1726853737.94399: in VariableManager get_vars() 30583 1726853737.94436: Calling all_inventory to load vars for managed_node2 30583 1726853737.94439: Calling groups_inventory to load vars for managed_node2 30583 1726853737.94443: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853737.94455: Calling all_plugins_play to load vars for managed_node2 30583 1726853737.94458: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853737.94461: Calling groups_plugins_play to load vars for managed_node2 30583 1726853737.94985: done sending task result for task 02083763-bbaf-05ea-abc5-000000001668 30583 1726853737.94989: WORKER PROCESS EXITING 30583 1726853737.96205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853737.97522: done with get_vars() 30583 1726853737.97539: done getting variables 30583 1726853737.97590: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:35:37 -0400 (0:00:00.409) 0:01:13.313 ****** 30583 1726853737.97614: entering _queue_task() for managed_node2/set_fact 30583 1726853737.97863: worker is 1 (out of 1 available) 30583 1726853737.97879: exiting _queue_task() for managed_node2/set_fact 30583 1726853737.97891: done queuing things up, now waiting for results queue to drain 30583 1726853737.97892: waiting for pending results... 
30583 1726853737.98083: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30583 1726853737.98172: in run() - task 02083763-bbaf-05ea-abc5-000000001669 30583 1726853737.98183: variable 'ansible_search_path' from source: unknown 30583 1726853737.98186: variable 'ansible_search_path' from source: unknown 30583 1726853737.98215: calling self._execute() 30583 1726853737.98292: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853737.98296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853737.98305: variable 'omit' from source: magic vars 30583 1726853737.98593: variable 'ansible_distribution_major_version' from source: facts 30583 1726853737.98603: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853737.98695: variable 'nm_profile_exists' from source: set_fact 30583 1726853737.98706: Evaluated conditional (nm_profile_exists.rc == 0): False 30583 1726853737.98710: when evaluation is False, skipping this task 30583 1726853737.98713: _execute() done 30583 1726853737.98715: dumping result to json 30583 1726853737.98718: done dumping result, returning 30583 1726853737.98724: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-05ea-abc5-000000001669] 30583 1726853737.98727: sending task result for task 02083763-bbaf-05ea-abc5-000000001669 30583 1726853737.98807: done sending task result for task 02083763-bbaf-05ea-abc5-000000001669 30583 1726853737.98810: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 30583 1726853737.98852: no more pending results, returning what we have 30583 1726853737.98855: results queue empty 30583 1726853737.98856: checking for any_errors_fatal 30583 
1726853737.98867: done checking for any_errors_fatal 30583 1726853737.98868: checking for max_fail_percentage 30583 1726853737.98870: done checking for max_fail_percentage 30583 1726853737.98873: checking to see if all hosts have failed and the running result is not ok 30583 1726853737.98873: done checking to see if all hosts have failed 30583 1726853737.98874: getting the remaining hosts for this loop 30583 1726853737.98876: done getting the remaining hosts for this loop 30583 1726853737.98880: getting the next task for host managed_node2 30583 1726853737.98890: done getting next task for host managed_node2 30583 1726853737.98892: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 30583 1726853737.98899: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853737.98903: getting variables 30583 1726853737.98905: in VariableManager get_vars() 30583 1726853737.98939: Calling all_inventory to load vars for managed_node2 30583 1726853737.98941: Calling groups_inventory to load vars for managed_node2 30583 1726853737.98945: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853737.98955: Calling all_plugins_play to load vars for managed_node2 30583 1726853737.98957: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853737.98959: Calling groups_plugins_play to load vars for managed_node2 30583 1726853738.00251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853738.01903: done with get_vars() 30583 1726853738.01928: done getting variables 30583 1726853738.01996: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853738.02122: variable 'profile' from source: play vars 30583 1726853738.02126: variable 'interface' from source: play vars 30583 1726853738.02186: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 13:35:38 -0400 (0:00:00.046) 0:01:13.359 ****** 30583 1726853738.02221: entering _queue_task() for managed_node2/command 30583 1726853738.02695: worker is 1 (out of 1 available) 30583 1726853738.02707: exiting _queue_task() for managed_node2/command 30583 1726853738.02718: done queuing things up, now waiting for results queue to drain 30583 1726853738.02719: waiting for pending results... 
30583 1726853738.03428: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr 30583 1726853738.03484: in run() - task 02083763-bbaf-05ea-abc5-00000000166b 30583 1726853738.03631: variable 'ansible_search_path' from source: unknown 30583 1726853738.03635: variable 'ansible_search_path' from source: unknown 30583 1726853738.03638: calling self._execute() 30583 1726853738.03869: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853738.03884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853738.03898: variable 'omit' from source: magic vars 30583 1726853738.04708: variable 'ansible_distribution_major_version' from source: facts 30583 1726853738.04778: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853738.04967: variable 'profile_stat' from source: set_fact 30583 1726853738.04984: Evaluated conditional (profile_stat.stat.exists): False 30583 1726853738.04992: when evaluation is False, skipping this task 30583 1726853738.04998: _execute() done 30583 1726853738.05005: dumping result to json 30583 1726853738.05025: done dumping result, returning 30583 1726853738.05037: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr [02083763-bbaf-05ea-abc5-00000000166b] 30583 1726853738.05045: sending task result for task 02083763-bbaf-05ea-abc5-00000000166b skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30583 1726853738.05215: no more pending results, returning what we have 30583 1726853738.05219: results queue empty 30583 1726853738.05220: checking for any_errors_fatal 30583 1726853738.05228: done checking for any_errors_fatal 30583 1726853738.05229: checking for max_fail_percentage 30583 1726853738.05231: done checking for max_fail_percentage 30583 1726853738.05232: checking to see if all hosts 
have failed and the running result is not ok 30583 1726853738.05233: done checking to see if all hosts have failed 30583 1726853738.05233: getting the remaining hosts for this loop 30583 1726853738.05235: done getting the remaining hosts for this loop 30583 1726853738.05239: getting the next task for host managed_node2 30583 1726853738.05248: done getting next task for host managed_node2 30583 1726853738.05251: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 30583 1726853738.05257: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853738.05262: getting variables 30583 1726853738.05263: in VariableManager get_vars() 30583 1726853738.05305: Calling all_inventory to load vars for managed_node2 30583 1726853738.05308: Calling groups_inventory to load vars for managed_node2 30583 1726853738.05312: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853738.05327: Calling all_plugins_play to load vars for managed_node2 30583 1726853738.05331: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853738.05334: Calling groups_plugins_play to load vars for managed_node2 30583 1726853738.05984: done sending task result for task 02083763-bbaf-05ea-abc5-00000000166b 30583 1726853738.05988: WORKER PROCESS EXITING 30583 1726853738.09563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853738.12853: done with get_vars() 30583 1726853738.13095: done getting variables 30583 1726853738.13164: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853738.13486: variable 'profile' from source: play vars 30583 1726853738.13490: variable 'interface' from source: play vars 30583 1726853738.13551: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:35:38 -0400 (0:00:00.113) 0:01:13.473 ****** 30583 1726853738.13590: entering _queue_task() for managed_node2/set_fact 30583 1726853738.14374: worker is 1 (out of 1 available) 30583 1726853738.14386: exiting _queue_task() for managed_node2/set_fact 30583 
1726853738.14397: done queuing things up, now waiting for results queue to drain 30583 1726853738.14398: waiting for pending results... 30583 1726853738.14947: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr 30583 1726853738.15174: in run() - task 02083763-bbaf-05ea-abc5-00000000166c 30583 1726853738.15222: variable 'ansible_search_path' from source: unknown 30583 1726853738.15257: variable 'ansible_search_path' from source: unknown 30583 1726853738.15312: calling self._execute() 30583 1726853738.15877: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853738.16276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853738.16280: variable 'omit' from source: magic vars 30583 1726853738.16869: variable 'ansible_distribution_major_version' from source: facts 30583 1726853738.17276: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853738.17279: variable 'profile_stat' from source: set_fact 30583 1726853738.17283: Evaluated conditional (profile_stat.stat.exists): False 30583 1726853738.17286: when evaluation is False, skipping this task 30583 1726853738.17289: _execute() done 30583 1726853738.17292: dumping result to json 30583 1726853738.17294: done dumping result, returning 30583 1726853738.17298: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [02083763-bbaf-05ea-abc5-00000000166c] 30583 1726853738.17300: sending task result for task 02083763-bbaf-05ea-abc5-00000000166c 30583 1726853738.17378: done sending task result for task 02083763-bbaf-05ea-abc5-00000000166c skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30583 1726853738.17428: no more pending results, returning what we have 30583 1726853738.17433: results queue empty 30583 1726853738.17434: checking for 
any_errors_fatal 30583 1726853738.17441: done checking for any_errors_fatal 30583 1726853738.17442: checking for max_fail_percentage 30583 1726853738.17444: done checking for max_fail_percentage 30583 1726853738.17445: checking to see if all hosts have failed and the running result is not ok 30583 1726853738.17446: done checking to see if all hosts have failed 30583 1726853738.17447: getting the remaining hosts for this loop 30583 1726853738.17449: done getting the remaining hosts for this loop 30583 1726853738.17453: getting the next task for host managed_node2 30583 1726853738.17464: done getting next task for host managed_node2 30583 1726853738.17467: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 30583 1726853738.17475: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853738.17480: getting variables 30583 1726853738.17482: in VariableManager get_vars() 30583 1726853738.17521: Calling all_inventory to load vars for managed_node2 30583 1726853738.17524: Calling groups_inventory to load vars for managed_node2 30583 1726853738.17528: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853738.17541: Calling all_plugins_play to load vars for managed_node2 30583 1726853738.17545: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853738.17548: Calling groups_plugins_play to load vars for managed_node2 30583 1726853738.18285: WORKER PROCESS EXITING 30583 1726853738.20886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853738.24028: done with get_vars() 30583 1726853738.24050: done getting variables 30583 1726853738.24111: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853738.24232: variable 'profile' from source: play vars 30583 1726853738.24237: variable 'interface' from source: play vars 30583 1726853738.24299: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:35:38 -0400 (0:00:00.107) 0:01:13.580 ****** 30583 1726853738.24332: entering _queue_task() for managed_node2/command 30583 1726853738.24718: worker is 1 (out of 1 available) 30583 1726853738.24732: exiting _queue_task() for managed_node2/command 30583 1726853738.24743: done queuing things up, now waiting for results queue to drain 30583 1726853738.24744: 
waiting for pending results... 30583 1726853738.25060: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr 30583 1726853738.25246: in run() - task 02083763-bbaf-05ea-abc5-00000000166d 30583 1726853738.25268: variable 'ansible_search_path' from source: unknown 30583 1726853738.25287: variable 'ansible_search_path' from source: unknown 30583 1726853738.25510: calling self._execute() 30583 1726853738.25617: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853738.25693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853738.25710: variable 'omit' from source: magic vars 30583 1726853738.26118: variable 'ansible_distribution_major_version' from source: facts 30583 1726853738.26138: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853738.26268: variable 'profile_stat' from source: set_fact 30583 1726853738.26290: Evaluated conditional (profile_stat.stat.exists): False 30583 1726853738.26299: when evaluation is False, skipping this task 30583 1726853738.26306: _execute() done 30583 1726853738.26314: dumping result to json 30583 1726853738.26321: done dumping result, returning 30583 1726853738.26336: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr [02083763-bbaf-05ea-abc5-00000000166d] 30583 1726853738.26346: sending task result for task 02083763-bbaf-05ea-abc5-00000000166d skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30583 1726853738.26496: no more pending results, returning what we have 30583 1726853738.26500: results queue empty 30583 1726853738.26502: checking for any_errors_fatal 30583 1726853738.26510: done checking for any_errors_fatal 30583 1726853738.26511: checking for max_fail_percentage 30583 1726853738.26513: done checking for max_fail_percentage 30583 1726853738.26514: 
checking to see if all hosts have failed and the running result is not ok 30583 1726853738.26515: done checking to see if all hosts have failed 30583 1726853738.26516: getting the remaining hosts for this loop 30583 1726853738.26518: done getting the remaining hosts for this loop 30583 1726853738.26521: getting the next task for host managed_node2 30583 1726853738.26530: done getting next task for host managed_node2 30583 1726853738.26533: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 30583 1726853738.26539: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853738.26543: getting variables 30583 1726853738.26544: in VariableManager get_vars() 30583 1726853738.26590: Calling all_inventory to load vars for managed_node2 30583 1726853738.26593: Calling groups_inventory to load vars for managed_node2 30583 1726853738.26597: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853738.26610: Calling all_plugins_play to load vars for managed_node2 30583 1726853738.26614: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853738.26618: Calling groups_plugins_play to load vars for managed_node2 30583 1726853738.27313: done sending task result for task 02083763-bbaf-05ea-abc5-00000000166d 30583 1726853738.27316: WORKER PROCESS EXITING 30583 1726853738.28362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853738.30044: done with get_vars() 30583 1726853738.30077: done getting variables 30583 1726853738.30139: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853738.30257: variable 'profile' from source: play vars 30583 1726853738.30261: variable 'interface' from source: play vars 30583 1726853738.30320: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:35:38 -0400 (0:00:00.060) 0:01:13.640 ****** 30583 1726853738.30358: entering _queue_task() for managed_node2/set_fact 30583 1726853738.30812: worker is 1 (out of 1 available) 30583 1726853738.30824: exiting _queue_task() for managed_node2/set_fact 30583 
1726853738.30836: done queuing things up, now waiting for results queue to drain 30583 1726853738.30837: waiting for pending results... 30583 1726853738.31310: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr 30583 1726853738.31519: in run() - task 02083763-bbaf-05ea-abc5-00000000166e 30583 1726853738.31557: variable 'ansible_search_path' from source: unknown 30583 1726853738.31560: variable 'ansible_search_path' from source: unknown 30583 1726853738.31575: calling self._execute() 30583 1726853738.31693: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853738.31696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853738.31776: variable 'omit' from source: magic vars 30583 1726853738.32054: variable 'ansible_distribution_major_version' from source: facts 30583 1726853738.32067: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853738.32238: variable 'profile_stat' from source: set_fact 30583 1726853738.32255: Evaluated conditional (profile_stat.stat.exists): False 30583 1726853738.32262: when evaluation is False, skipping this task 30583 1726853738.32270: _execute() done 30583 1726853738.32280: dumping result to json 30583 1726853738.32288: done dumping result, returning 30583 1726853738.32299: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr [02083763-bbaf-05ea-abc5-00000000166e] 30583 1726853738.32315: sending task result for task 02083763-bbaf-05ea-abc5-00000000166e skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30583 1726853738.32468: no more pending results, returning what we have 30583 1726853738.32476: results queue empty 30583 1726853738.32477: checking for any_errors_fatal 30583 1726853738.32485: done checking for any_errors_fatal 30583 1726853738.32486: checking for 
max_fail_percentage 30583 1726853738.32489: done checking for max_fail_percentage 30583 1726853738.32490: checking to see if all hosts have failed and the running result is not ok 30583 1726853738.32491: done checking to see if all hosts have failed 30583 1726853738.32492: getting the remaining hosts for this loop 30583 1726853738.32494: done getting the remaining hosts for this loop 30583 1726853738.32499: getting the next task for host managed_node2 30583 1726853738.32509: done getting next task for host managed_node2 30583 1726853738.32512: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 30583 1726853738.32516: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853738.32524: getting variables 30583 1726853738.32526: in VariableManager get_vars() 30583 1726853738.32566: Calling all_inventory to load vars for managed_node2 30583 1726853738.32569: Calling groups_inventory to load vars for managed_node2 30583 1726853738.32792: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853738.32804: Calling all_plugins_play to load vars for managed_node2 30583 1726853738.32808: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853738.32811: Calling groups_plugins_play to load vars for managed_node2 30583 1726853738.33434: done sending task result for task 02083763-bbaf-05ea-abc5-00000000166e 30583 1726853738.33439: WORKER PROCESS EXITING 30583 1726853738.35686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853738.37686: done with get_vars() 30583 1726853738.37712: done getting variables 30583 1726853738.37778: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853738.37896: variable 'profile' from source: play vars 30583 1726853738.37900: variable 'interface' from source: play vars 30583 1726853738.37954: variable 'interface' from source: play vars TASK [Assert that the profile is absent - 'statebr'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 13:35:38 -0400 (0:00:00.076) 0:01:13.717 ****** 30583 1726853738.37985: entering _queue_task() for managed_node2/assert 30583 1726853738.38354: worker is 1 (out of 1 available) 30583 1726853738.38368: exiting _queue_task() for managed_node2/assert 30583 
1726853738.38495: done queuing things up, now waiting for results queue to drain 30583 1726853738.38497: waiting for pending results... 30583 1726853738.39096: running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'statebr' 30583 1726853738.39116: in run() - task 02083763-bbaf-05ea-abc5-0000000015d5 30583 1726853738.39137: variable 'ansible_search_path' from source: unknown 30583 1726853738.39144: variable 'ansible_search_path' from source: unknown 30583 1726853738.39408: calling self._execute() 30583 1726853738.39412: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853738.39485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853738.39500: variable 'omit' from source: magic vars 30583 1726853738.40236: variable 'ansible_distribution_major_version' from source: facts 30583 1726853738.40266: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853738.40285: variable 'omit' from source: magic vars 30583 1726853738.40336: variable 'omit' from source: magic vars 30583 1726853738.40444: variable 'profile' from source: play vars 30583 1726853738.40461: variable 'interface' from source: play vars 30583 1726853738.40534: variable 'interface' from source: play vars 30583 1726853738.40560: variable 'omit' from source: magic vars 30583 1726853738.40615: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853738.40657: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853738.40685: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853738.40711: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853738.40732: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853738.40767: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853738.40778: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853738.40787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853738.40902: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853738.40914: Set connection var ansible_timeout to 10 30583 1726853738.40920: Set connection var ansible_connection to ssh 30583 1726853738.40934: Set connection var ansible_shell_executable to /bin/sh 30583 1726853738.40939: Set connection var ansible_shell_type to sh 30583 1726853738.40951: Set connection var ansible_pipelining to False 30583 1726853738.40978: variable 'ansible_shell_executable' from source: unknown 30583 1726853738.40984: variable 'ansible_connection' from source: unknown 30583 1726853738.40990: variable 'ansible_module_compression' from source: unknown 30583 1726853738.40996: variable 'ansible_shell_type' from source: unknown 30583 1726853738.41001: variable 'ansible_shell_executable' from source: unknown 30583 1726853738.41006: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853738.41012: variable 'ansible_pipelining' from source: unknown 30583 1726853738.41017: variable 'ansible_timeout' from source: unknown 30583 1726853738.41023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853738.41159: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853738.41258: variable 'omit' from source: magic vars 30583 1726853738.41261: starting 
attempt loop 30583 1726853738.41263: running the handler 30583 1726853738.41305: variable 'lsr_net_profile_exists' from source: set_fact 30583 1726853738.41314: Evaluated conditional (not lsr_net_profile_exists): True 30583 1726853738.41323: handler run complete 30583 1726853738.41338: attempt loop complete, returning result 30583 1726853738.41344: _execute() done 30583 1726853738.41349: dumping result to json 30583 1726853738.41354: done dumping result, returning 30583 1726853738.41372: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'statebr' [02083763-bbaf-05ea-abc5-0000000015d5] 30583 1726853738.41380: sending task result for task 02083763-bbaf-05ea-abc5-0000000015d5 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30583 1726853738.41520: no more pending results, returning what we have 30583 1726853738.41524: results queue empty 30583 1726853738.41525: checking for any_errors_fatal 30583 1726853738.41531: done checking for any_errors_fatal 30583 1726853738.41532: checking for max_fail_percentage 30583 1726853738.41534: done checking for max_fail_percentage 30583 1726853738.41535: checking to see if all hosts have failed and the running result is not ok 30583 1726853738.41536: done checking to see if all hosts have failed 30583 1726853738.41537: getting the remaining hosts for this loop 30583 1726853738.41539: done getting the remaining hosts for this loop 30583 1726853738.41543: getting the next task for host managed_node2 30583 1726853738.41552: done getting next task for host managed_node2 30583 1726853738.41556: ^ task is: TASK: Conditional asserts 30583 1726853738.41559: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853738.41564: getting variables 30583 1726853738.41566: in VariableManager get_vars() 30583 1726853738.41604: Calling all_inventory to load vars for managed_node2 30583 1726853738.41607: Calling groups_inventory to load vars for managed_node2 30583 1726853738.41610: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853738.41621: Calling all_plugins_play to load vars for managed_node2 30583 1726853738.41624: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853738.41627: Calling groups_plugins_play to load vars for managed_node2 30583 1726853738.42403: done sending task result for task 02083763-bbaf-05ea-abc5-0000000015d5 30583 1726853738.42406: WORKER PROCESS EXITING 30583 1726853738.43384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853738.44951: done with get_vars() 30583 1726853738.44981: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 13:35:38 -0400 (0:00:00.070) 0:01:13.787 ****** 30583 1726853738.45078: entering _queue_task() for managed_node2/include_tasks 30583 1726853738.45430: worker is 1 (out of 1 available) 30583 1726853738.45444: exiting _queue_task() for managed_node2/include_tasks 30583 1726853738.45456: done queuing things up, now waiting for results queue to drain 30583 1726853738.45457: waiting for pending results... 
30583 1726853738.45753: running TaskExecutor() for managed_node2/TASK: Conditional asserts 30583 1726853738.45873: in run() - task 02083763-bbaf-05ea-abc5-00000000100b 30583 1726853738.45899: variable 'ansible_search_path' from source: unknown 30583 1726853738.45907: variable 'ansible_search_path' from source: unknown 30583 1726853738.46195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853738.48755: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853738.48840: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853738.48887: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853738.48938: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853738.48973: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853738.49068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853738.49109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853738.49141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853738.49198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30583 1726853738.49261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853738.49387: dumping result to json 30583 1726853738.49396: done dumping result, returning 30583 1726853738.49406: done running TaskExecutor() for managed_node2/TASK: Conditional asserts [02083763-bbaf-05ea-abc5-00000000100b] 30583 1726853738.49417: sending task result for task 02083763-bbaf-05ea-abc5-00000000100b 30583 1726853738.49640: done sending task result for task 02083763-bbaf-05ea-abc5-00000000100b 30583 1726853738.49643: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } 30583 1726853738.49701: no more pending results, returning what we have 30583 1726853738.49706: results queue empty 30583 1726853738.49707: checking for any_errors_fatal 30583 1726853738.49714: done checking for any_errors_fatal 30583 1726853738.49715: checking for max_fail_percentage 30583 1726853738.49717: done checking for max_fail_percentage 30583 1726853738.49718: checking to see if all hosts have failed and the running result is not ok 30583 1726853738.49719: done checking to see if all hosts have failed 30583 1726853738.49720: getting the remaining hosts for this loop 30583 1726853738.49722: done getting the remaining hosts for this loop 30583 1726853738.49727: getting the next task for host managed_node2 30583 1726853738.49735: done getting next task for host managed_node2 30583 1726853738.49737: ^ task is: TASK: Success in test '{{ lsr_description }}' 30583 1726853738.49741: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853738.49745: getting variables 30583 1726853738.49747: in VariableManager get_vars() 30583 1726853738.49900: Calling all_inventory to load vars for managed_node2 30583 1726853738.49904: Calling groups_inventory to load vars for managed_node2 30583 1726853738.49907: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853738.49919: Calling all_plugins_play to load vars for managed_node2 30583 1726853738.49923: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853738.49926: Calling groups_plugins_play to load vars for managed_node2 30583 1726853738.51636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853738.53180: done with get_vars() 30583 1726853738.53206: done getting variables 30583 1726853738.53270: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853738.53391: variable 'lsr_description' from source: include params TASK [Success in test 'I can remove an existing profile without taking it down'] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 13:35:38 -0400 (0:00:00.083) 0:01:13.871 ****** 30583 1726853738.53423: entering _queue_task() for managed_node2/debug 30583 1726853738.53990: worker is 1 
(out of 1 available) 30583 1726853738.54001: exiting _queue_task() for managed_node2/debug 30583 1726853738.54011: done queuing things up, now waiting for results queue to drain 30583 1726853738.54012: waiting for pending results... 30583 1726853738.54253: running TaskExecutor() for managed_node2/TASK: Success in test 'I can remove an existing profile without taking it down' 30583 1726853738.54259: in run() - task 02083763-bbaf-05ea-abc5-00000000100c 30583 1726853738.54262: variable 'ansible_search_path' from source: unknown 30583 1726853738.54265: variable 'ansible_search_path' from source: unknown 30583 1726853738.54300: calling self._execute() 30583 1726853738.54402: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853738.54413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853738.54429: variable 'omit' from source: magic vars 30583 1726853738.54817: variable 'ansible_distribution_major_version' from source: facts 30583 1726853738.54835: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853738.54847: variable 'omit' from source: magic vars 30583 1726853738.54900: variable 'omit' from source: magic vars 30583 1726853738.55076: variable 'lsr_description' from source: include params 30583 1726853738.55079: variable 'omit' from source: magic vars 30583 1726853738.55082: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853738.55121: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853738.55149: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853738.55170: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853738.55189: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853738.55233: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853738.55243: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853738.55251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853738.55363: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853738.55378: Set connection var ansible_timeout to 10 30583 1726853738.55385: Set connection var ansible_connection to ssh 30583 1726853738.55395: Set connection var ansible_shell_executable to /bin/sh 30583 1726853738.55400: Set connection var ansible_shell_type to sh 30583 1726853738.55410: Set connection var ansible_pipelining to False 30583 1726853738.55439: variable 'ansible_shell_executable' from source: unknown 30583 1726853738.55543: variable 'ansible_connection' from source: unknown 30583 1726853738.55546: variable 'ansible_module_compression' from source: unknown 30583 1726853738.55548: variable 'ansible_shell_type' from source: unknown 30583 1726853738.55550: variable 'ansible_shell_executable' from source: unknown 30583 1726853738.55552: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853738.55554: variable 'ansible_pipelining' from source: unknown 30583 1726853738.55556: variable 'ansible_timeout' from source: unknown 30583 1726853738.55557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853738.55621: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853738.55636: variable 'omit' from source: magic vars 30583 1726853738.55652: starting attempt 
loop 30583 1726853738.55658: running the handler 30583 1726853738.55705: handler run complete 30583 1726853738.55720: attempt loop complete, returning result 30583 1726853738.55726: _execute() done 30583 1726853738.55731: dumping result to json 30583 1726853738.55736: done dumping result, returning 30583 1726853738.55746: done running TaskExecutor() for managed_node2/TASK: Success in test 'I can remove an existing profile without taking it down' [02083763-bbaf-05ea-abc5-00000000100c] 30583 1726853738.55758: sending task result for task 02083763-bbaf-05ea-abc5-00000000100c ok: [managed_node2] => {} MSG: +++++ Success in test 'I can remove an existing profile without taking it down' +++++ 30583 1726853738.55910: no more pending results, returning what we have 30583 1726853738.55914: results queue empty 30583 1726853738.55915: checking for any_errors_fatal 30583 1726853738.55925: done checking for any_errors_fatal 30583 1726853738.55926: checking for max_fail_percentage 30583 1726853738.55928: done checking for max_fail_percentage 30583 1726853738.55929: checking to see if all hosts have failed and the running result is not ok 30583 1726853738.55930: done checking to see if all hosts have failed 30583 1726853738.55930: getting the remaining hosts for this loop 30583 1726853738.55932: done getting the remaining hosts for this loop 30583 1726853738.55936: getting the next task for host managed_node2 30583 1726853738.55945: done getting next task for host managed_node2 30583 1726853738.55949: ^ task is: TASK: Cleanup 30583 1726853738.55952: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853738.55958: getting variables 30583 1726853738.55959: in VariableManager get_vars() 30583 1726853738.55998: Calling all_inventory to load vars for managed_node2 30583 1726853738.56000: Calling groups_inventory to load vars for managed_node2 30583 1726853738.56004: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853738.56016: Calling all_plugins_play to load vars for managed_node2 30583 1726853738.56020: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853738.56023: Calling groups_plugins_play to load vars for managed_node2 30583 1726853738.56714: done sending task result for task 02083763-bbaf-05ea-abc5-00000000100c 30583 1726853738.56717: WORKER PROCESS EXITING 30583 1726853738.57705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853738.59481: done with get_vars() 30583 1726853738.59505: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 13:35:38 -0400 (0:00:00.061) 0:01:13.933 ****** 30583 1726853738.59610: entering _queue_task() for managed_node2/include_tasks 30583 1726853738.60101: worker is 1 (out of 1 available) 30583 1726853738.60114: exiting _queue_task() for managed_node2/include_tasks 30583 1726853738.60125: done queuing things up, now waiting for results queue to drain 30583 1726853738.60127: waiting for pending results... 
30583 1726853738.60335: running TaskExecutor() for managed_node2/TASK: Cleanup 30583 1726853738.60465: in run() - task 02083763-bbaf-05ea-abc5-000000001010 30583 1726853738.60487: variable 'ansible_search_path' from source: unknown 30583 1726853738.60496: variable 'ansible_search_path' from source: unknown 30583 1726853738.60550: variable 'lsr_cleanup' from source: include params 30583 1726853738.60764: variable 'lsr_cleanup' from source: include params 30583 1726853738.60849: variable 'omit' from source: magic vars 30583 1726853738.61005: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853738.61021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853738.61039: variable 'omit' from source: magic vars 30583 1726853738.61403: variable 'ansible_distribution_major_version' from source: facts 30583 1726853738.61406: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853738.61409: variable 'item' from source: unknown 30583 1726853738.61412: variable 'item' from source: unknown 30583 1726853738.61442: variable 'item' from source: unknown 30583 1726853738.61512: variable 'item' from source: unknown 30583 1726853738.61805: dumping result to json 30583 1726853738.61809: done dumping result, returning 30583 1726853738.61811: done running TaskExecutor() for managed_node2/TASK: Cleanup [02083763-bbaf-05ea-abc5-000000001010] 30583 1726853738.61813: sending task result for task 02083763-bbaf-05ea-abc5-000000001010 30583 1726853738.61853: done sending task result for task 02083763-bbaf-05ea-abc5-000000001010 30583 1726853738.61855: WORKER PROCESS EXITING 30583 1726853738.61881: no more pending results, returning what we have 30583 1726853738.61886: in VariableManager get_vars() 30583 1726853738.61928: Calling all_inventory to load vars for managed_node2 30583 1726853738.61932: Calling groups_inventory to load vars for managed_node2 30583 1726853738.61936: Calling 
all_plugins_inventory to load vars for managed_node2 30583 1726853738.61951: Calling all_plugins_play to load vars for managed_node2 30583 1726853738.61954: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853738.61957: Calling groups_plugins_play to load vars for managed_node2 30583 1726853738.63516: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853738.65038: done with get_vars() 30583 1726853738.65062: variable 'ansible_search_path' from source: unknown 30583 1726853738.65063: variable 'ansible_search_path' from source: unknown 30583 1726853738.65107: we have included files to process 30583 1726853738.65108: generating all_blocks data 30583 1726853738.65110: done generating all_blocks data 30583 1726853738.65116: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30583 1726853738.65117: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30583 1726853738.65119: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30583 1726853738.65323: done processing included file 30583 1726853738.65325: iterating over new_blocks loaded from include file 30583 1726853738.65327: in VariableManager get_vars() 30583 1726853738.65344: done with get_vars() 30583 1726853738.65346: filtering new block on tags 30583 1726853738.65377: done filtering new block on tags 30583 1726853738.65380: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node2 => (item=tasks/cleanup_profile+device.yml) 30583 1726853738.65385: extending task lists for all hosts with included blocks 
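The include being processed above follows the common include-tasks-with-loop shape: the `Cleanup` task loops over `lsr_cleanup` (here yielding `tasks/cleanup_profile+device.yml`) and, per the earlier `Evaluated conditional` line, is gated on the distribution check. A hedged sketch of what such a task typically looks like (the exact task in `run_test.yml` may differ):

```yaml
- name: Cleanup
  include_tasks: "{{ item }}"
  loop: "{{ lsr_cleanup }}"
  when: ansible_distribution_major_version != '6'
```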
30583 1726853738.66628: done extending task lists 30583 1726853738.66630: done processing included files 30583 1726853738.66630: results queue empty 30583 1726853738.66631: checking for any_errors_fatal 30583 1726853738.66636: done checking for any_errors_fatal 30583 1726853738.66637: checking for max_fail_percentage 30583 1726853738.66638: done checking for max_fail_percentage 30583 1726853738.66639: checking to see if all hosts have failed and the running result is not ok 30583 1726853738.66640: done checking to see if all hosts have failed 30583 1726853738.66640: getting the remaining hosts for this loop 30583 1726853738.66642: done getting the remaining hosts for this loop 30583 1726853738.66644: getting the next task for host managed_node2 30583 1726853738.66649: done getting next task for host managed_node2 30583 1726853738.66651: ^ task is: TASK: Cleanup profile and device 30583 1726853738.66654: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853738.66657: getting variables 30583 1726853738.66658: in VariableManager get_vars() 30583 1726853738.66670: Calling all_inventory to load vars for managed_node2 30583 1726853738.66674: Calling groups_inventory to load vars for managed_node2 30583 1726853738.66676: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853738.66683: Calling all_plugins_play to load vars for managed_node2 30583 1726853738.66685: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853738.66688: Calling groups_plugins_play to load vars for managed_node2 30583 1726853738.73540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853738.75052: done with get_vars() 30583 1726853738.75086: done getting variables 30583 1726853738.75131: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 13:35:38 -0400 (0:00:00.155) 0:01:14.088 ****** 30583 1726853738.75159: entering _queue_task() for managed_node2/shell 30583 1726853738.75531: worker is 1 (out of 1 available) 30583 1726853738.75543: exiting _queue_task() for managed_node2/shell 30583 1726853738.75556: done queuing things up, now waiting for results queue to drain 30583 1726853738.75558: waiting for pending results... 
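The `Cleanup profile and device` task queued here runs the four commands visible later in the task result (`nmcli con delete`, `nmcli con load`, `rm -f` of the ifcfg file, `ip link del`), and the playbook tolerates the non-zero return code (`...ignoring`). A standalone sketch of the same sequence that instead tolerates each failure inline (the profile name and the `command -v` guards are this sketch's own additions, not part of the test playbook):

```shell
#!/bin/sh
# Best-effort cleanup of a NetworkManager profile and its device.
# Mirrors the commands from the task result; each step may fail
# harmlessly when the profile, ifcfg file, or device is already gone.
profile="statebr"
ifcfg="/etc/sysconfig/network-scripts/ifcfg-$profile"

if command -v nmcli >/dev/null 2>&1; then
    nmcli con delete "$profile" || true
    nmcli con load "$ifcfg" || true
fi
rm -f "$ifcfg" 2>/dev/null || true
if command -v ip >/dev/null 2>&1; then
    ip link del "$profile" 2>/dev/null || true
fi
echo "cleanup attempted for $profile"
```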
30583 1726853738.76073: running TaskExecutor() for managed_node2/TASK: Cleanup profile and device 30583 1726853738.76078: in run() - task 02083763-bbaf-05ea-abc5-0000000016ad 30583 1726853738.76081: variable 'ansible_search_path' from source: unknown 30583 1726853738.76083: variable 'ansible_search_path' from source: unknown 30583 1726853738.76086: calling self._execute() 30583 1726853738.76183: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853738.76195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853738.76208: variable 'omit' from source: magic vars 30583 1726853738.76613: variable 'ansible_distribution_major_version' from source: facts 30583 1726853738.76631: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853738.76642: variable 'omit' from source: magic vars 30583 1726853738.76698: variable 'omit' from source: magic vars 30583 1726853738.76860: variable 'interface' from source: play vars 30583 1726853738.76927: variable 'omit' from source: magic vars 30583 1726853738.76940: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853738.76982: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853738.77008: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853738.77036: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853738.77054: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853738.77176: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853738.77179: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853738.77182: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853738.77213: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853738.77224: Set connection var ansible_timeout to 10 30583 1726853738.77230: Set connection var ansible_connection to ssh 30583 1726853738.77239: Set connection var ansible_shell_executable to /bin/sh 30583 1726853738.77246: Set connection var ansible_shell_type to sh 30583 1726853738.77258: Set connection var ansible_pipelining to False 30583 1726853738.77288: variable 'ansible_shell_executable' from source: unknown 30583 1726853738.77377: variable 'ansible_connection' from source: unknown 30583 1726853738.77380: variable 'ansible_module_compression' from source: unknown 30583 1726853738.77382: variable 'ansible_shell_type' from source: unknown 30583 1726853738.77385: variable 'ansible_shell_executable' from source: unknown 30583 1726853738.77386: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853738.77388: variable 'ansible_pipelining' from source: unknown 30583 1726853738.77391: variable 'ansible_timeout' from source: unknown 30583 1726853738.77394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853738.77486: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853738.77512: variable 'omit' from source: magic vars 30583 1726853738.77523: starting attempt loop 30583 1726853738.77530: running the handler 30583 1726853738.77545: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853738.77619: _low_level_execute_command(): starting 30583 1726853738.77622: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853738.78338: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853738.78392: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853738.78439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853738.78501: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853738.78534: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853738.78577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853738.78664: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853738.80434: stdout chunk (state=3): >>>/root <<< 30583 1726853738.80601: stderr chunk (state=3): >>>debug2: Received exit status from master 
0 <<< 30583 1726853738.80605: stdout chunk (state=3): >>><<< 30583 1726853738.80607: stderr chunk (state=3): >>><<< 30583 1726853738.80737: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853738.80741: _low_level_execute_command(): starting 30583 1726853738.80744: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853738.8063538-34065-155847096367447 `" && echo ansible-tmp-1726853738.8063538-34065-155847096367447="` echo /root/.ansible/tmp/ansible-tmp-1726853738.8063538-34065-155847096367447 `" ) && sleep 0' 30583 1726853738.81299: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853738.81349: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853738.81420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853738.83421: stdout chunk (state=3): >>>ansible-tmp-1726853738.8063538-34065-155847096367447=/root/.ansible/tmp/ansible-tmp-1726853738.8063538-34065-155847096367447 <<< 30583 1726853738.83677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853738.83680: stdout chunk (state=3): >>><<< 30583 1726853738.83683: stderr chunk (state=3): >>><<< 30583 1726853738.83686: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853738.8063538-34065-155847096367447=/root/.ansible/tmp/ansible-tmp-1726853738.8063538-34065-155847096367447 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 
10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853738.83689: variable 'ansible_module_compression' from source: unknown 30583 1726853738.83720: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30583 1726853738.83768: variable 'ansible_facts' from source: unknown 30583 1726853738.83877: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853738.8063538-34065-155847096367447/AnsiballZ_command.py 30583 1726853738.84048: Sending initial data 30583 1726853738.84051: Sent initial data (156 bytes) 30583 1726853738.84802: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853738.84868: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853738.84889: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853738.84919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853738.85025: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853738.86684: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853738.86761: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853738.86847: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpjag330f1 /root/.ansible/tmp/ansible-tmp-1726853738.8063538-34065-155847096367447/AnsiballZ_command.py <<< 30583 1726853738.86850: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853738.8063538-34065-155847096367447/AnsiballZ_command.py" <<< 30583 1726853738.86930: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpjag330f1" to remote "/root/.ansible/tmp/ansible-tmp-1726853738.8063538-34065-155847096367447/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853738.8063538-34065-155847096367447/AnsiballZ_command.py" <<< 30583 1726853738.87764: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853738.87976: stderr chunk (state=3): >>><<< 30583 1726853738.87980: stdout chunk (state=3): >>><<< 30583 1726853738.87982: done transferring module to remote 30583 1726853738.87985: _low_level_execute_command(): starting 30583 1726853738.87987: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853738.8063538-34065-155847096367447/ /root/.ansible/tmp/ansible-tmp-1726853738.8063538-34065-155847096367447/AnsiballZ_command.py && sleep 0' 30583 1726853738.88453: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853738.88560: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853738.88791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853738.88961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853738.90865: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853738.90869: stdout chunk (state=3): >>><<< 30583 1726853738.90873: stderr chunk (state=3): >>><<< 30583 1726853738.91077: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853738.91086: _low_level_execute_command(): starting 30583 1726853738.91089: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853738.8063538-34065-155847096367447/AnsiballZ_command.py && sleep 0' 30583 1726853738.91703: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853738.91733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853738.92061: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853738.92085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853738.92090: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853738.92205: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853738.92291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 30583 1726853739.16882: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (11d9efea-f4e2-4de6-9b17-bfa7490d4840) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 13:35:39.078001", "end": "2024-09-20 13:35:39.164920", "delta": "0:00:00.086919", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30583 1726853739.18279: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853739.18283: stdout chunk (state=3): >>><<< 30583 1726853739.18285: stderr chunk (state=3): >>><<< 30583 1726853739.18310: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "Connection 'statebr' (11d9efea-f4e2-4de6-9b17-bfa7490d4840) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 13:35:39.078001", "end": "2024-09-20 13:35:39.164920", "delta": "0:00:00.086919", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.197 closed. 30583 1726853739.18352: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853738.8063538-34065-155847096367447/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853739.18557: _low_level_execute_command(): starting 30583 1726853739.18563: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853738.8063538-34065-155847096367447/ > /dev/null 2>&1 && sleep 0' 30583 1726853739.19676: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853739.19777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853739.19879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853739.20029: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853739.20070: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853739.20140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853739.22135: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853739.22146: stdout chunk (state=3): >>><<< 30583 1726853739.22174: stderr chunk (state=3): >>><<< 30583 1726853739.22196: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853739.22220: handler run complete 30583 1726853739.22294: Evaluated conditional (False): False 30583 1726853739.22320: attempt loop complete, returning result 30583 1726853739.22478: _execute() done 30583 1726853739.22483: dumping result to json 30583 1726853739.22485: done dumping result, returning 30583 1726853739.22487: done running TaskExecutor() for managed_node2/TASK: Cleanup profile and device [02083763-bbaf-05ea-abc5-0000000016ad] 30583 1726853739.22490: sending task result for task 02083763-bbaf-05ea-abc5-0000000016ad 30583 1726853739.22569: done sending task result for task 02083763-bbaf-05ea-abc5-0000000016ad 30583 1726853739.22651: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.086919", "end": "2024-09-20 13:35:39.164920", "rc": 1, "start": "2024-09-20 13:35:39.078001" } STDOUT: Connection 'statebr' (11d9efea-f4e2-4de6-9b17-bfa7490d4840) successfully deleted. 
STDERR: Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' Cannot find device "statebr" MSG: non-zero return code ...ignoring 30583 1726853739.22724: no more pending results, returning what we have 30583 1726853739.22729: results queue empty 30583 1726853739.22731: checking for any_errors_fatal 30583 1726853739.22733: done checking for any_errors_fatal 30583 1726853739.22734: checking for max_fail_percentage 30583 1726853739.22736: done checking for max_fail_percentage 30583 1726853739.22737: checking to see if all hosts have failed and the running result is not ok 30583 1726853739.22738: done checking to see if all hosts have failed 30583 1726853739.22739: getting the remaining hosts for this loop 30583 1726853739.22741: done getting the remaining hosts for this loop 30583 1726853739.22744: getting the next task for host managed_node2 30583 1726853739.22760: done getting next task for host managed_node2 30583 1726853739.22764: ^ task is: TASK: Include the task 'run_test.yml' 30583 1726853739.22766: ^ state is: HOST STATE: block=7, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853739.22772: getting variables 30583 1726853739.22774: in VariableManager get_vars() 30583 1726853739.22812: Calling all_inventory to load vars for managed_node2 30583 1726853739.22816: Calling groups_inventory to load vars for managed_node2 30583 1726853739.22820: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853739.22830: Calling all_plugins_play to load vars for managed_node2 30583 1726853739.22834: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853739.22836: Calling groups_plugins_play to load vars for managed_node2 30583 1726853739.26160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853739.29345: done with get_vars() 30583 1726853739.29380: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:102 Friday 20 September 2024 13:35:39 -0400 (0:00:00.544) 0:01:14.633 ****** 30583 1726853739.29618: entering _queue_task() for managed_node2/include_tasks 30583 1726853739.30110: worker is 1 (out of 1 available) 30583 1726853739.30124: exiting _queue_task() for managed_node2/include_tasks 30583 1726853739.30137: done queuing things up, now waiting for results queue to drain 30583 1726853739.30138: waiting for pending results... 
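The ignored failure above comes from a cleanup step that tears down the test profile and device. Reconstructed from the command string in the module invocation, the task likely looks roughly like this (the task name matches the log; `ignore_errors` is an assumption inferred from the `...ignoring` marker, since the task returned rc=1 but the play continued):

```yaml
# Hedged reconstruction of the "Cleanup profile and device" task.
# The command text is taken verbatim from the _raw_params shown in the log;
# ignore_errors is inferred from the "...ignoring" marker after the
# non-zero return code.
- name: Cleanup profile and device
  ansible.builtin.shell: |
    nmcli con delete statebr
    nmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr
    rm -f /etc/sysconfig/network-scripts/ifcfg-statebr
    ip link del statebr
  ignore_errors: true
```

The rc=1 here is expected when the state is already clean: per the STDOUT/STDERR above, `nmcli con delete` succeeded, while `nmcli con load` and `ip link del` failed only because the ifcfg file and the `statebr` device were already gone.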
30583 1726853739.30385: running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' 30583 1726853739.30433: in run() - task 02083763-bbaf-05ea-abc5-000000000015 30583 1726853739.30481: variable 'ansible_search_path' from source: unknown 30583 1726853739.30485: calling self._execute() 30583 1726853739.30589: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853739.30596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853739.30606: variable 'omit' from source: magic vars 30583 1726853739.31077: variable 'ansible_distribution_major_version' from source: facts 30583 1726853739.31081: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853739.31084: _execute() done 30583 1726853739.31087: dumping result to json 30583 1726853739.31089: done dumping result, returning 30583 1726853739.31091: done running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' [02083763-bbaf-05ea-abc5-000000000015] 30583 1726853739.31094: sending task result for task 02083763-bbaf-05ea-abc5-000000000015 30583 1726853739.31305: no more pending results, returning what we have 30583 1726853739.31309: in VariableManager get_vars() 30583 1726853739.31343: Calling all_inventory to load vars for managed_node2 30583 1726853739.31346: Calling groups_inventory to load vars for managed_node2 30583 1726853739.31348: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853739.31360: Calling all_plugins_play to load vars for managed_node2 30583 1726853739.31363: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853739.31366: Calling groups_plugins_play to load vars for managed_node2 30583 1726853739.32022: done sending task result for task 02083763-bbaf-05ea-abc5-000000000015 30583 1726853739.32025: WORKER PROCESS EXITING 30583 1726853739.33530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 30583 1726853739.35382: done with get_vars() 30583 1726853739.35416: variable 'ansible_search_path' from source: unknown 30583 1726853739.35437: we have included files to process 30583 1726853739.35442: generating all_blocks data 30583 1726853739.35444: done generating all_blocks data 30583 1726853739.35451: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30583 1726853739.35452: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30583 1726853739.35455: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30583 1726853739.35942: in VariableManager get_vars() 30583 1726853739.35965: done with get_vars() 30583 1726853739.36007: in VariableManager get_vars() 30583 1726853739.36024: done with get_vars() 30583 1726853739.36072: in VariableManager get_vars() 30583 1726853739.36088: done with get_vars() 30583 1726853739.36127: in VariableManager get_vars() 30583 1726853739.36146: done with get_vars() 30583 1726853739.36188: in VariableManager get_vars() 30583 1726853739.36207: done with get_vars() 30583 1726853739.36649: in VariableManager get_vars() 30583 1726853739.36670: done with get_vars() 30583 1726853739.36684: done processing included file 30583 1726853739.36685: iterating over new_blocks loaded from include file 30583 1726853739.36690: in VariableManager get_vars() 30583 1726853739.36703: done with get_vars() 30583 1726853739.36705: filtering new block on tags 30583 1726853739.36812: done filtering new block on tags 30583 1726853739.36816: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node2 30583 1726853739.36821: extending task lists for all hosts with included 
blocks 30583 1726853739.36854: done extending task lists 30583 1726853739.36855: done processing included files 30583 1726853739.36856: results queue empty 30583 1726853739.36857: checking for any_errors_fatal 30583 1726853739.36866: done checking for any_errors_fatal 30583 1726853739.36867: checking for max_fail_percentage 30583 1726853739.36868: done checking for max_fail_percentage 30583 1726853739.36869: checking to see if all hosts have failed and the running result is not ok 30583 1726853739.36870: done checking to see if all hosts have failed 30583 1726853739.36870: getting the remaining hosts for this loop 30583 1726853739.36873: done getting the remaining hosts for this loop 30583 1726853739.36876: getting the next task for host managed_node2 30583 1726853739.36880: done getting next task for host managed_node2 30583 1726853739.36882: ^ task is: TASK: TEST: {{ lsr_description }} 30583 1726853739.36885: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853739.36888: getting variables 30583 1726853739.36888: in VariableManager get_vars() 30583 1726853739.36898: Calling all_inventory to load vars for managed_node2 30583 1726853739.36900: Calling groups_inventory to load vars for managed_node2 30583 1726853739.36903: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853739.37080: Calling all_plugins_play to load vars for managed_node2 30583 1726853739.37083: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853739.37085: Calling groups_plugins_play to load vars for managed_node2 30583 1726853739.38985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853739.41833: done with get_vars() 30583 1726853739.41864: done getting variables 30583 1726853739.41975: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853739.42216: variable 'lsr_description' from source: include params TASK [TEST: I can take a profile down that is absent] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 13:35:39 -0400 (0:00:00.126) 0:01:14.759 ****** 30583 1726853739.42265: entering _queue_task() for managed_node2/debug 30583 1726853739.43022: worker is 1 (out of 1 available) 30583 1726853739.43034: exiting _queue_task() for managed_node2/debug 30583 1726853739.43047: done queuing things up, now waiting for results queue to drain 30583 1726853739.43048: waiting for pending results... 
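The include at `tests_states.yml:102` hands a scenario description and several task lists to `run_test.yml`. A sketch of the likely shape of that call follows; the variable names (`lsr_description`, `lsr_setup`, `lsr_test`, `lsr_assert`) and their values are taken verbatim from the "Show item" loop output in this run, while the exact include syntax is an assumption:

```yaml
# Assumed shape of the include driving this scenario; the values below
# are the ones echoed by the "Show item" loop later in this log.
- name: Include the task 'run_test.yml'
  ansible.builtin.include_tasks: tasks/run_test.yml
  vars:
    lsr_description: I can take a profile down that is absent
    lsr_setup:
      - tasks/create_bridge_profile.yml
      - tasks/activate_profile.yml
      - tasks/remove_profile.yml
    lsr_test:
      - tasks/remove+down_profile.yml
    lsr_assert:
      - tasks/assert_profile_absent.yml
```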
30583 1726853739.43624: running TaskExecutor() for managed_node2/TASK: TEST: I can take a profile down that is absent 30583 1726853739.43743: in run() - task 02083763-bbaf-05ea-abc5-000000001744 30583 1726853739.43747: variable 'ansible_search_path' from source: unknown 30583 1726853739.43752: variable 'ansible_search_path' from source: unknown 30583 1726853739.43852: calling self._execute() 30583 1726853739.43879: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853739.43885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853739.43896: variable 'omit' from source: magic vars 30583 1726853739.44262: variable 'ansible_distribution_major_version' from source: facts 30583 1726853739.44274: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853739.44285: variable 'omit' from source: magic vars 30583 1726853739.44397: variable 'omit' from source: magic vars 30583 1726853739.44420: variable 'lsr_description' from source: include params 30583 1726853739.44440: variable 'omit' from source: magic vars 30583 1726853739.44486: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853739.44523: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853739.44544: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853739.44563: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853739.44575: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853739.44613: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853739.44616: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 
1726853739.44618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853739.44722: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853739.44725: Set connection var ansible_timeout to 10 30583 1726853739.44728: Set connection var ansible_connection to ssh 30583 1726853739.44730: Set connection var ansible_shell_executable to /bin/sh 30583 1726853739.44732: Set connection var ansible_shell_type to sh 30583 1726853739.44744: Set connection var ansible_pipelining to False 30583 1726853739.44767: variable 'ansible_shell_executable' from source: unknown 30583 1726853739.44772: variable 'ansible_connection' from source: unknown 30583 1726853739.44775: variable 'ansible_module_compression' from source: unknown 30583 1726853739.44777: variable 'ansible_shell_type' from source: unknown 30583 1726853739.44780: variable 'ansible_shell_executable' from source: unknown 30583 1726853739.44782: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853739.44784: variable 'ansible_pipelining' from source: unknown 30583 1726853739.44786: variable 'ansible_timeout' from source: unknown 30583 1726853739.44790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853739.44929: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853739.44939: variable 'omit' from source: magic vars 30583 1726853739.44945: starting attempt loop 30583 1726853739.44948: running the handler 30583 1726853739.44997: handler run complete 30583 1726853739.45010: attempt loop complete, returning result 30583 1726853739.45012: _execute() done 30583 1726853739.45015: dumping result to json 30583 1726853739.45017: done dumping result, returning 
30583 1726853739.45025: done running TaskExecutor() for managed_node2/TASK: TEST: I can take a profile down that is absent [02083763-bbaf-05ea-abc5-000000001744] 30583 1726853739.45029: sending task result for task 02083763-bbaf-05ea-abc5-000000001744 30583 1726853739.45119: done sending task result for task 02083763-bbaf-05ea-abc5-000000001744 30583 1726853739.45122: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: ########## I can take a profile down that is absent ########## 30583 1726853739.45200: no more pending results, returning what we have 30583 1726853739.45205: results queue empty 30583 1726853739.45206: checking for any_errors_fatal 30583 1726853739.45208: done checking for any_errors_fatal 30583 1726853739.45208: checking for max_fail_percentage 30583 1726853739.45210: done checking for max_fail_percentage 30583 1726853739.45211: checking to see if all hosts have failed and the running result is not ok 30583 1726853739.45212: done checking to see if all hosts have failed 30583 1726853739.45212: getting the remaining hosts for this loop 30583 1726853739.45215: done getting the remaining hosts for this loop 30583 1726853739.45218: getting the next task for host managed_node2 30583 1726853739.45225: done getting next task for host managed_node2 30583 1726853739.45227: ^ task is: TASK: Show item 30583 1726853739.45230: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853739.45233: getting variables 30583 1726853739.45234: in VariableManager get_vars() 30583 1726853739.45279: Calling all_inventory to load vars for managed_node2 30583 1726853739.45282: Calling groups_inventory to load vars for managed_node2 30583 1726853739.45285: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853739.45294: Calling all_plugins_play to load vars for managed_node2 30583 1726853739.45297: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853739.45299: Calling groups_plugins_play to load vars for managed_node2 30583 1726853739.47394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853739.48953: done with get_vars() 30583 1726853739.48986: done getting variables 30583 1726853739.49052: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 13:35:39 -0400 (0:00:00.068) 0:01:14.828 ****** 30583 1726853739.49090: entering _queue_task() for managed_node2/debug 30583 1726853739.49487: worker is 1 (out of 1 available) 30583 1726853739.49500: exiting _queue_task() for managed_node2/debug 30583 1726853739.49514: done queuing things up, now waiting for results queue to drain 30583 1726853739.49515: waiting for pending results... 
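The "Show item" task at `run_test.yml:9` iterates over the scenario variables and prints each one by name. Judging from the `ansible_loop_var: item` and the per-item output that follows, it is most likely a `debug`/`loop` combination of roughly this form (a sketch inferred from the output, not the exact source):

```yaml
# Likely shape of the "Show item" task: debug each lsr_* variable by name,
# so the loop output pairs the item name with its resolved value.
- name: Show item
  ansible.builtin.debug:
    var: "{{ item }}"
  loop:
    - lsr_description
    - lsr_setup
    - lsr_test
    - lsr_assert
```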
30583 1726853739.49752: running TaskExecutor() for managed_node2/TASK: Show item 30583 1726853739.49896: in run() - task 02083763-bbaf-05ea-abc5-000000001745 30583 1726853739.50077: variable 'ansible_search_path' from source: unknown 30583 1726853739.50080: variable 'ansible_search_path' from source: unknown 30583 1726853739.50083: variable 'omit' from source: magic vars 30583 1726853739.50140: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853739.50153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853739.50169: variable 'omit' from source: magic vars 30583 1726853739.50596: variable 'ansible_distribution_major_version' from source: facts 30583 1726853739.50608: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853739.50615: variable 'omit' from source: magic vars 30583 1726853739.50651: variable 'omit' from source: magic vars 30583 1726853739.50702: variable 'item' from source: unknown 30583 1726853739.50776: variable 'item' from source: unknown 30583 1726853739.50795: variable 'omit' from source: magic vars 30583 1726853739.50836: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853739.50877: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853739.50898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853739.50915: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853739.50927: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853739.50960: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853739.50964: variable 'ansible_host' from source: host vars for 'managed_node2' 
30583 1726853739.50972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853739.51187: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853739.51190: Set connection var ansible_timeout to 10 30583 1726853739.51193: Set connection var ansible_connection to ssh 30583 1726853739.51195: Set connection var ansible_shell_executable to /bin/sh 30583 1726853739.51198: Set connection var ansible_shell_type to sh 30583 1726853739.51201: Set connection var ansible_pipelining to False 30583 1726853739.51205: variable 'ansible_shell_executable' from source: unknown 30583 1726853739.51208: variable 'ansible_connection' from source: unknown 30583 1726853739.51211: variable 'ansible_module_compression' from source: unknown 30583 1726853739.51213: variable 'ansible_shell_type' from source: unknown 30583 1726853739.51216: variable 'ansible_shell_executable' from source: unknown 30583 1726853739.51219: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853739.51221: variable 'ansible_pipelining' from source: unknown 30583 1726853739.51223: variable 'ansible_timeout' from source: unknown 30583 1726853739.51227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853739.51322: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853739.51334: variable 'omit' from source: magic vars 30583 1726853739.51343: starting attempt loop 30583 1726853739.51346: running the handler 30583 1726853739.51478: variable 'lsr_description' from source: include params 30583 1726853739.51483: variable 'lsr_description' from source: include params 30583 1726853739.51487: handler run complete 30583 1726853739.51509: attempt loop 
complete, returning result 30583 1726853739.51525: variable 'item' from source: unknown 30583 1726853739.51583: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can take a profile down that is absent" } 30583 1726853739.51749: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853739.51753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853739.51756: variable 'omit' from source: magic vars 30583 1726853739.52094: variable 'ansible_distribution_major_version' from source: facts 30583 1726853739.52098: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853739.52100: variable 'omit' from source: magic vars 30583 1726853739.52102: variable 'omit' from source: magic vars 30583 1726853739.52105: variable 'item' from source: unknown 30583 1726853739.52107: variable 'item' from source: unknown 30583 1726853739.52109: variable 'omit' from source: magic vars 30583 1726853739.52111: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853739.52113: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853739.52116: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853739.52118: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853739.52120: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853739.52122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853739.52135: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853739.52140: Set connection var ansible_timeout 
to 10 30583 1726853739.52143: Set connection var ansible_connection to ssh 30583 1726853739.52153: Set connection var ansible_shell_executable to /bin/sh 30583 1726853739.52156: Set connection var ansible_shell_type to sh 30583 1726853739.52160: Set connection var ansible_pipelining to False 30583 1726853739.52185: variable 'ansible_shell_executable' from source: unknown 30583 1726853739.52188: variable 'ansible_connection' from source: unknown 30583 1726853739.52192: variable 'ansible_module_compression' from source: unknown 30583 1726853739.52208: variable 'ansible_shell_type' from source: unknown 30583 1726853739.52211: variable 'ansible_shell_executable' from source: unknown 30583 1726853739.52214: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853739.52216: variable 'ansible_pipelining' from source: unknown 30583 1726853739.52218: variable 'ansible_timeout' from source: unknown 30583 1726853739.52220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853739.52303: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853739.52317: variable 'omit' from source: magic vars 30583 1726853739.52320: starting attempt loop 30583 1726853739.52322: running the handler 30583 1726853739.52339: variable 'lsr_setup' from source: include params 30583 1726853739.52407: variable 'lsr_setup' from source: include params 30583 1726853739.52452: handler run complete 30583 1726853739.52467: attempt loop complete, returning result 30583 1726853739.52487: variable 'item' from source: unknown 30583 1726853739.52540: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ 
"tasks/create_bridge_profile.yml", "tasks/activate_profile.yml", "tasks/remove_profile.yml" ] } 30583 1726853739.52629: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853739.52632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853739.52642: variable 'omit' from source: magic vars 30583 1726853739.52794: variable 'ansible_distribution_major_version' from source: facts 30583 1726853739.52803: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853739.52808: variable 'omit' from source: magic vars 30583 1726853739.52822: variable 'omit' from source: magic vars 30583 1726853739.52863: variable 'item' from source: unknown 30583 1726853739.52927: variable 'item' from source: unknown 30583 1726853739.52940: variable 'omit' from source: magic vars 30583 1726853739.52957: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853739.52968: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853739.52979: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853739.52989: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853739.52992: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853739.52995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853739.53066: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853739.53076: Set connection var ansible_timeout to 10 30583 1726853739.53079: Set connection var ansible_connection to ssh 30583 1726853739.53082: Set connection var ansible_shell_executable to /bin/sh 30583 1726853739.53089: Set connection var ansible_shell_type to sh 
30583 1726853739.53185: Set connection var ansible_pipelining to False 30583 1726853739.53189: variable 'ansible_shell_executable' from source: unknown 30583 1726853739.53191: variable 'ansible_connection' from source: unknown 30583 1726853739.53209: variable 'ansible_module_compression' from source: unknown 30583 1726853739.53212: variable 'ansible_shell_type' from source: unknown 30583 1726853739.53214: variable 'ansible_shell_executable' from source: unknown 30583 1726853739.53217: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853739.53219: variable 'ansible_pipelining' from source: unknown 30583 1726853739.53221: variable 'ansible_timeout' from source: unknown 30583 1726853739.53223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853739.53229: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853739.53241: variable 'omit' from source: magic vars 30583 1726853739.53244: starting attempt loop 30583 1726853739.53246: running the handler 30583 1726853739.53265: variable 'lsr_test' from source: include params 30583 1726853739.53330: variable 'lsr_test' from source: include params 30583 1726853739.53351: handler run complete 30583 1726853739.53363: attempt loop complete, returning result 30583 1726853739.53516: variable 'item' from source: unknown 30583 1726853739.53524: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/remove+down_profile.yml" ] } 30583 1726853739.53591: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853739.53594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 
1726853739.53597: variable 'omit' from source: magic vars 30583 1726853739.53689: variable 'ansible_distribution_major_version' from source: facts 30583 1726853739.53693: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853739.53698: variable 'omit' from source: magic vars 30583 1726853739.53711: variable 'omit' from source: magic vars 30583 1726853739.53748: variable 'item' from source: unknown 30583 1726853739.53808: variable 'item' from source: unknown 30583 1726853739.53838: variable 'omit' from source: magic vars 30583 1726853739.53841: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853739.53844: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853739.53854: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853739.53880: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853739.53883: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853739.53885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853739.53937: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853739.54055: Set connection var ansible_timeout to 10 30583 1726853739.54061: Set connection var ansible_connection to ssh 30583 1726853739.54065: Set connection var ansible_shell_executable to /bin/sh 30583 1726853739.54068: Set connection var ansible_shell_type to sh 30583 1726853739.54073: Set connection var ansible_pipelining to False 30583 1726853739.54076: variable 'ansible_shell_executable' from source: unknown 30583 1726853739.54079: variable 'ansible_connection' from source: unknown 30583 1726853739.54081: variable 'ansible_module_compression' from 
source: unknown 30583 1726853739.54084: variable 'ansible_shell_type' from source: unknown 30583 1726853739.54087: variable 'ansible_shell_executable' from source: unknown 30583 1726853739.54089: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853739.54092: variable 'ansible_pipelining' from source: unknown 30583 1726853739.54094: variable 'ansible_timeout' from source: unknown 30583 1726853739.54097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853739.54108: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853739.54116: variable 'omit' from source: magic vars 30583 1726853739.54119: starting attempt loop 30583 1726853739.54121: running the handler 30583 1726853739.54140: variable 'lsr_assert' from source: include params 30583 1726853739.54198: variable 'lsr_assert' from source: include params 30583 1726853739.54221: handler run complete 30583 1726853739.54236: attempt loop complete, returning result 30583 1726853739.54247: variable 'item' from source: unknown 30583 1726853739.54304: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_profile_absent.yml" ] } 30583 1726853739.54445: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853739.54449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853739.54452: variable 'omit' from source: magic vars 30583 1726853739.54606: variable 'ansible_distribution_major_version' from source: facts 30583 1726853739.54609: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853739.54615: variable 
'omit' from source: magic vars 30583 1726853739.54622: variable 'omit' from source: magic vars 30583 1726853739.54648: variable 'item' from source: unknown 30583 1726853739.54692: variable 'item' from source: unknown 30583 1726853739.54707: variable 'omit' from source: magic vars 30583 1726853739.54718: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853739.54723: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853739.54730: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853739.54738: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853739.54740: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853739.54743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853739.54793: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853739.54796: Set connection var ansible_timeout to 10 30583 1726853739.54799: Set connection var ansible_connection to ssh 30583 1726853739.54804: Set connection var ansible_shell_executable to /bin/sh 30583 1726853739.54806: Set connection var ansible_shell_type to sh 30583 1726853739.54815: Set connection var ansible_pipelining to False 30583 1726853739.54829: variable 'ansible_shell_executable' from source: unknown 30583 1726853739.54832: variable 'ansible_connection' from source: unknown 30583 1726853739.54834: variable 'ansible_module_compression' from source: unknown 30583 1726853739.54837: variable 'ansible_shell_type' from source: unknown 30583 1726853739.54839: variable 'ansible_shell_executable' from source: unknown 30583 1726853739.54841: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 
1726853739.54845: variable 'ansible_pipelining' from source: unknown 30583 1726853739.54847: variable 'ansible_timeout' from source: unknown 30583 1726853739.54851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853739.54910: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853739.54916: variable 'omit' from source: magic vars 30583 1726853739.54920: starting attempt loop 30583 1726853739.54923: running the handler 30583 1726853739.54938: variable 'lsr_assert_when' from source: include params 30583 1726853739.54983: variable 'lsr_assert_when' from source: include params 30583 1726853739.55042: variable 'network_provider' from source: set_fact 30583 1726853739.55066: handler run complete 30583 1726853739.55078: attempt loop complete, returning result 30583 1726853739.55090: variable 'item' from source: unknown 30583 1726853739.55131: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": [ { "condition": true, "what": "tasks/assert_device_absent.yml" } ] } 30583 1726853739.55209: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853739.55213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853739.55216: variable 'omit' from source: magic vars 30583 1726853739.55306: variable 'ansible_distribution_major_version' from source: facts 30583 1726853739.55310: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853739.55314: variable 'omit' from source: magic vars 30583 1726853739.55324: variable 'omit' from source: magic vars 30583 1726853739.55352: variable 'item' from 
source: unknown 30583 1726853739.55395: variable 'item' from source: unknown 30583 1726853739.55405: variable 'omit' from source: magic vars 30583 1726853739.55418: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853739.55424: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853739.55429: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853739.55441: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853739.55447: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853739.55449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853739.55491: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853739.55494: Set connection var ansible_timeout to 10 30583 1726853739.55496: Set connection var ansible_connection to ssh 30583 1726853739.55502: Set connection var ansible_shell_executable to /bin/sh 30583 1726853739.55504: Set connection var ansible_shell_type to sh 30583 1726853739.55511: Set connection var ansible_pipelining to False 30583 1726853739.55526: variable 'ansible_shell_executable' from source: unknown 30583 1726853739.55528: variable 'ansible_connection' from source: unknown 30583 1726853739.55531: variable 'ansible_module_compression' from source: unknown 30583 1726853739.55533: variable 'ansible_shell_type' from source: unknown 30583 1726853739.55535: variable 'ansible_shell_executable' from source: unknown 30583 1726853739.55537: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853739.55541: variable 'ansible_pipelining' from source: unknown 30583 1726853739.55548: variable 'ansible_timeout' from source: unknown 30583 
1726853739.55552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853739.55606: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853739.55612: variable 'omit' from source: magic vars 30583 1726853739.55615: starting attempt loop 30583 1726853739.55617: running the handler 30583 1726853739.55632: variable 'lsr_fail_debug' from source: play vars 30583 1726853739.55678: variable 'lsr_fail_debug' from source: play vars 30583 1726853739.55690: handler run complete 30583 1726853739.55699: attempt loop complete, returning result 30583 1726853739.55710: variable 'item' from source: unknown 30583 1726853739.55751: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 30583 1726853739.55824: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853739.55827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853739.55830: variable 'omit' from source: magic vars 30583 1726853739.55922: variable 'ansible_distribution_major_version' from source: facts 30583 1726853739.55926: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853739.55930: variable 'omit' from source: magic vars 30583 1726853739.55945: variable 'omit' from source: magic vars 30583 1726853739.55967: variable 'item' from source: unknown 30583 1726853739.56011: variable 'item' from source: unknown 30583 1726853739.56022: variable 'omit' from source: magic vars 30583 1726853739.56035: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 30583 1726853739.56042: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853739.56045: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853739.56056: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853739.56062: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853739.56064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853739.56106: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853739.56110: Set connection var ansible_timeout to 10 30583 1726853739.56112: Set connection var ansible_connection to ssh 30583 1726853739.56117: Set connection var ansible_shell_executable to /bin/sh 30583 1726853739.56120: Set connection var ansible_shell_type to sh 30583 1726853739.56127: Set connection var ansible_pipelining to False 30583 1726853739.56140: variable 'ansible_shell_executable' from source: unknown 30583 1726853739.56143: variable 'ansible_connection' from source: unknown 30583 1726853739.56145: variable 'ansible_module_compression' from source: unknown 30583 1726853739.56148: variable 'ansible_shell_type' from source: unknown 30583 1726853739.56150: variable 'ansible_shell_executable' from source: unknown 30583 1726853739.56152: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853739.56160: variable 'ansible_pipelining' from source: unknown 30583 1726853739.56162: variable 'ansible_timeout' from source: unknown 30583 1726853739.56164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853739.56219: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853739.56225: variable 'omit' from source: magic vars 30583 1726853739.56228: starting attempt loop 30583 1726853739.56230: running the handler 30583 1726853739.56244: variable 'lsr_cleanup' from source: include params 30583 1726853739.56301: variable 'lsr_cleanup' from source: include params 30583 1726853739.56312: handler run complete 30583 1726853739.56321: attempt loop complete, returning result 30583 1726853739.56332: variable 'item' from source: unknown 30583 1726853739.56453: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 30583 1726853739.56630: dumping result to json 30583 1726853739.56633: done dumping result, returning 30583 1726853739.56636: done running TaskExecutor() for managed_node2/TASK: Show item [02083763-bbaf-05ea-abc5-000000001745] 30583 1726853739.56638: sending task result for task 02083763-bbaf-05ea-abc5-000000001745 30583 1726853739.56683: done sending task result for task 02083763-bbaf-05ea-abc5-000000001745 30583 1726853739.56686: WORKER PROCESS EXITING 30583 1726853739.56832: no more pending results, returning what we have 30583 1726853739.56835: results queue empty 30583 1726853739.56836: checking for any_errors_fatal 30583 1726853739.56842: done checking for any_errors_fatal 30583 1726853739.56843: checking for max_fail_percentage 30583 1726853739.56844: done checking for max_fail_percentage 30583 1726853739.56845: checking to see if all hosts have failed and the running result is not ok 30583 1726853739.56846: done checking to see if all hosts have failed 30583 1726853739.56847: getting the remaining hosts for this loop 30583 1726853739.56848: done getting the remaining hosts for this loop 30583 
1726853739.56851: getting the next task for host managed_node2 30583 1726853739.56857: done getting next task for host managed_node2 30583 1726853739.56862: ^ task is: TASK: Include the task 'show_interfaces.yml' 30583 1726853739.56864: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853739.56868: getting variables 30583 1726853739.56869: in VariableManager get_vars() 30583 1726853739.56903: Calling all_inventory to load vars for managed_node2 30583 1726853739.56906: Calling groups_inventory to load vars for managed_node2 30583 1726853739.56909: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853739.56919: Calling all_plugins_play to load vars for managed_node2 30583 1726853739.56923: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853739.56925: Calling groups_plugins_play to load vars for managed_node2 30583 1726853739.58136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853739.58982: done with get_vars() 30583 1726853739.59000: done getting variables

TASK [Include the task 'show_interfaces.yml'] **********************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21
Friday 20 September 2024 13:35:39 -0400 (0:00:00.099) 0:01:14.927 ******

30583 1726853739.59066: entering _queue_task() for managed_node2/include_tasks 30583
1726853739.59325: worker is 1 (out of 1 available) 30583 1726853739.59339: exiting _queue_task() for managed_node2/include_tasks 30583 1726853739.59352: done queuing things up, now waiting for results queue to drain 30583 1726853739.59353: waiting for pending results... 30583 1726853739.59689: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 30583 1726853739.59725: in run() - task 02083763-bbaf-05ea-abc5-000000001746 30583 1726853739.59755: variable 'ansible_search_path' from source: unknown 30583 1726853739.59811: variable 'ansible_search_path' from source: unknown 30583 1726853739.59816: calling self._execute() 30583 1726853739.59935: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853739.59947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853739.59964: variable 'omit' from source: magic vars 30583 1726853739.60407: variable 'ansible_distribution_major_version' from source: facts 30583 1726853739.60425: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853739.60468: _execute() done 30583 1726853739.60473: dumping result to json 30583 1726853739.60476: done dumping result, returning 30583 1726853739.60479: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [02083763-bbaf-05ea-abc5-000000001746] 30583 1726853739.60481: sending task result for task 02083763-bbaf-05ea-abc5-000000001746 30583 1726853739.60708: no more pending results, returning what we have 30583 1726853739.60717: in VariableManager get_vars() 30583 1726853739.60766: Calling all_inventory to load vars for managed_node2 30583 1726853739.60769: Calling groups_inventory to load vars for managed_node2 30583 1726853739.60774: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853739.60813: Calling all_plugins_play to load vars for managed_node2 30583 1726853739.60817: Calling groups_plugins_inventory to load 
vars for managed_node2 30583 1726853739.60822: done sending task result for task 02083763-bbaf-05ea-abc5-000000001746 30583 1726853739.60824: WORKER PROCESS EXITING 30583 1726853739.60828: Calling groups_plugins_play to load vars for managed_node2 30583 1726853739.61666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853739.62573: done with get_vars() 30583 1726853739.62588: variable 'ansible_search_path' from source: unknown 30583 1726853739.62589: variable 'ansible_search_path' from source: unknown 30583 1726853739.62627: we have included files to process 30583 1726853739.62628: generating all_blocks data 30583 1726853739.62629: done generating all_blocks data 30583 1726853739.62632: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30583 1726853739.62633: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30583 1726853739.62637: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30583 1726853739.62743: in VariableManager get_vars() 30583 1726853739.62766: done with get_vars() 30583 1726853739.62873: done processing included file 30583 1726853739.62875: iterating over new_blocks loaded from include file 30583 1726853739.62876: in VariableManager get_vars() 30583 1726853739.62890: done with get_vars() 30583 1726853739.62892: filtering new block on tags 30583 1726853739.62922: done filtering new block on tags 30583 1726853739.62925: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 30583 1726853739.62929: extending task lists for all hosts with included blocks 30583 1726853739.63365: 
done extending task lists 30583 1726853739.63366: done processing included files 30583 1726853739.63367: results queue empty 30583 1726853739.63367: checking for any_errors_fatal 30583 1726853739.63374: done checking for any_errors_fatal 30583 1726853739.63374: checking for max_fail_percentage 30583 1726853739.63375: done checking for max_fail_percentage 30583 1726853739.63375: checking to see if all hosts have failed and the running result is not ok 30583 1726853739.63376: done checking to see if all hosts have failed 30583 1726853739.63376: getting the remaining hosts for this loop 30583 1726853739.63377: done getting the remaining hosts for this loop 30583 1726853739.63379: getting the next task for host managed_node2 30583 1726853739.63382: done getting next task for host managed_node2 30583 1726853739.63384: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30583 1726853739.63387: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853739.63390: getting variables 30583 1726853739.63391: in VariableManager get_vars() 30583 1726853739.63404: Calling all_inventory to load vars for managed_node2 30583 1726853739.63406: Calling groups_inventory to load vars for managed_node2 30583 1726853739.63408: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853739.63413: Calling all_plugins_play to load vars for managed_node2 30583 1726853739.63415: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853739.63416: Calling groups_plugins_play to load vars for managed_node2 30583 1726853739.64640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853739.65490: done with get_vars() 30583 1726853739.65509: done getting variables

TASK [Include the task 'get_current_interfaces.yml'] ***************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3
Friday 20 September 2024 13:35:39 -0400 (0:00:00.064) 0:01:14.992 ******

30583 1726853739.65562: entering _queue_task() for managed_node2/include_tasks 30583 1726853739.65818: worker is 1 (out of 1 available) 30583 1726853739.65833: exiting _queue_task() for managed_node2/include_tasks 30583 1726853739.65847: done queuing things up, now waiting for results queue to drain 30583 1726853739.65848: waiting for pending results... 
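The include chain the log has just walked — run_test.yml:21 pulling in show_interfaces.yml, whose line 3 in turn pulls in get_current_interfaces.yml — implies task files of roughly the following shape. This is a hypothetical reconstruction from the "task path" lines and TASK banners above, not the actual contents of the fedora.linux_system_roles test files:

```yaml
# tasks/run_test.yml, line 21 (sketch inferred from the log; the real file
# in the collection may differ)
- name: Include the task 'show_interfaces.yml'
  include_tasks: tasks/show_interfaces.yml

# tasks/show_interfaces.yml, line 3 (sketch)
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml
```

Each `include_tasks` appears in the log as the "we have included files to process / generating all_blocks data" sequence, followed by "extending task lists for all hosts with included blocks" once the included file's blocks are spliced into the host's task list.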
30583 1726853739.66037: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 30583 1726853739.66128: in run() - task 02083763-bbaf-05ea-abc5-00000000176d 30583 1726853739.66138: variable 'ansible_search_path' from source: unknown 30583 1726853739.66141: variable 'ansible_search_path' from source: unknown 30583 1726853739.66175: calling self._execute() 30583 1726853739.66253: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853739.66257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853739.66267: variable 'omit' from source: magic vars 30583 1726853739.66691: variable 'ansible_distribution_major_version' from source: facts 30583 1726853739.66696: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853739.66699: _execute() done 30583 1726853739.66702: dumping result to json 30583 1726853739.66704: done dumping result, returning 30583 1726853739.66707: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [02083763-bbaf-05ea-abc5-00000000176d] 30583 1726853739.66709: sending task result for task 02083763-bbaf-05ea-abc5-00000000176d 30583 1726853739.66845: no more pending results, returning what we have 30583 1726853739.66850: in VariableManager get_vars() 30583 1726853739.66953: Calling all_inventory to load vars for managed_node2 30583 1726853739.66956: Calling groups_inventory to load vars for managed_node2 30583 1726853739.66961: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853739.66976: Calling all_plugins_play to load vars for managed_node2 30583 1726853739.66980: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853739.66983: Calling groups_plugins_play to load vars for managed_node2 30583 1726853739.67688: done sending task result for task 02083763-bbaf-05ea-abc5-00000000176d 30583 1726853739.67691: WORKER PROCESS EXITING 30583 
1726853739.68115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853739.69086: done with get_vars() 30583 1726853739.69101: variable 'ansible_search_path' from source: unknown 30583 1726853739.69102: variable 'ansible_search_path' from source: unknown 30583 1726853739.69126: we have included files to process 30583 1726853739.69127: generating all_blocks data 30583 1726853739.69128: done generating all_blocks data 30583 1726853739.69129: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30583 1726853739.69129: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30583 1726853739.69131: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30583 1726853739.69310: done processing included file 30583 1726853739.69311: iterating over new_blocks loaded from include file 30583 1726853739.69313: in VariableManager get_vars() 30583 1726853739.69323: done with get_vars() 30583 1726853739.69324: filtering new block on tags 30583 1726853739.69346: done filtering new block on tags 30583 1726853739.69347: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 30583 1726853739.69351: extending task lists for all hosts with included blocks 30583 1726853739.69444: done extending task lists 30583 1726853739.69445: done processing included files 30583 1726853739.69445: results queue empty 30583 1726853739.69446: checking for any_errors_fatal 30583 1726853739.69448: done checking for any_errors_fatal 30583 1726853739.69448: checking for max_fail_percentage 30583 1726853739.69449: done 
checking for max_fail_percentage 30583 1726853739.69450: checking to see if all hosts have failed and the running result is not ok 30583 1726853739.69450: done checking to see if all hosts have failed 30583 1726853739.69451: getting the remaining hosts for this loop 30583 1726853739.69452: done getting the remaining hosts for this loop 30583 1726853739.69453: getting the next task for host managed_node2 30583 1726853739.69456: done getting next task for host managed_node2 30583 1726853739.69458: ^ task is: TASK: Gather current interface info 30583 1726853739.69461: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853739.69463: getting variables 30583 1726853739.69463: in VariableManager get_vars() 30583 1726853739.69473: Calling all_inventory to load vars for managed_node2 30583 1726853739.69475: Calling groups_inventory to load vars for managed_node2 30583 1726853739.69477: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853739.69480: Calling all_plugins_play to load vars for managed_node2 30583 1726853739.69482: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853739.69483: Calling groups_plugins_play to load vars for managed_node2 30583 1726853739.70110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853739.70948: done with get_vars() 30583 1726853739.70965: done getting variables 30583 1726853739.70999: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Gather current interface info] *******************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Friday 20 September 2024 13:35:39 -0400 (0:00:00.054) 0:01:15.047 ******

30583 1726853739.71022: entering _queue_task() for managed_node2/command 30583 1726853739.71284: worker is 1 (out of 1 available) 30583 1726853739.71296: exiting _queue_task() for managed_node2/command 30583 1726853739.71309: done queuing things up, now waiting for results queue to drain 30583 1726853739.71310: waiting for pending results... 
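The "Gather current interface info" task queued above is a `command` task (the log loads ActionModule 'command' for it). The command string itself never appears in this excerpt, so the sketch below is hypothetical: the command shown and the `register` variable name are placeholders, not the collection's real values:

```yaml
# get_current_interfaces.yml, line 3 (hypothetical sketch; the actual
# command is not visible in this excerpt)
- name: Gather current interface info
  command: ls /sys/class/net     # placeholder for the real command
  register: _current_interfaces  # placeholder variable name
  changed_when: false            # a read-only probe should never report "changed"
```

Running a task like this is what triggers the `_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'` handshake that follows in the log: before shipping the module payload, Ansible resolves the remote user's home directory so it can create its remote temporary directory there.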
30583 1726853739.71502: running TaskExecutor() for managed_node2/TASK: Gather current interface info 30583 1726853739.71582: in run() - task 02083763-bbaf-05ea-abc5-0000000017a8 30583 1726853739.71594: variable 'ansible_search_path' from source: unknown 30583 1726853739.71599: variable 'ansible_search_path' from source: unknown 30583 1726853739.71626: calling self._execute() 30583 1726853739.71707: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853739.71711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853739.71720: variable 'omit' from source: magic vars 30583 1726853739.72020: variable 'ansible_distribution_major_version' from source: facts 30583 1726853739.72030: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853739.72036: variable 'omit' from source: magic vars 30583 1726853739.72073: variable 'omit' from source: magic vars 30583 1726853739.72100: variable 'omit' from source: magic vars 30583 1726853739.72133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853739.72161: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853739.72185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853739.72201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853739.72209: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853739.72233: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853739.72237: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853739.72239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 
1726853739.72313: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853739.72318: Set connection var ansible_timeout to 10 30583 1726853739.72321: Set connection var ansible_connection to ssh 30583 1726853739.72326: Set connection var ansible_shell_executable to /bin/sh 30583 1726853739.72329: Set connection var ansible_shell_type to sh 30583 1726853739.72336: Set connection var ansible_pipelining to False 30583 1726853739.72353: variable 'ansible_shell_executable' from source: unknown 30583 1726853739.72356: variable 'ansible_connection' from source: unknown 30583 1726853739.72360: variable 'ansible_module_compression' from source: unknown 30583 1726853739.72364: variable 'ansible_shell_type' from source: unknown 30583 1726853739.72367: variable 'ansible_shell_executable' from source: unknown 30583 1726853739.72369: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853739.72374: variable 'ansible_pipelining' from source: unknown 30583 1726853739.72377: variable 'ansible_timeout' from source: unknown 30583 1726853739.72381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853739.72482: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853739.72491: variable 'omit' from source: magic vars 30583 1726853739.72496: starting attempt loop 30583 1726853739.72499: running the handler 30583 1726853739.72512: _low_level_execute_command(): starting 30583 1726853739.72519: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853739.73046: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853739.73050: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853739.73053: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853739.73055: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853739.73060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853739.73093: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853739.73106: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853739.73197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853739.74955: stdout chunk (state=3): >>>/root <<< 30583 1726853739.75061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853739.75089: stderr chunk (state=3): >>><<< 30583 1726853739.75093: stdout chunk (state=3): >>><<< 30583 1726853739.75116: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853739.75126: _low_level_execute_command(): starting 30583 1726853739.75131: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853739.751141-34109-183998476232373 `" && echo ansible-tmp-1726853739.751141-34109-183998476232373="` echo /root/.ansible/tmp/ansible-tmp-1726853739.751141-34109-183998476232373 `" ) && sleep 0' 30583 1726853739.75556: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853739.75561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853739.75565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853739.75577: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853739.75580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853739.75622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853739.75628: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853739.75633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853739.75698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853739.77693: stdout chunk (state=3): >>>ansible-tmp-1726853739.751141-34109-183998476232373=/root/.ansible/tmp/ansible-tmp-1726853739.751141-34109-183998476232373 <<< 30583 1726853739.77804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853739.77828: stderr chunk (state=3): >>><<< 30583 1726853739.77831: stdout chunk (state=3): >>><<< 30583 1726853739.77847: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853739.751141-34109-183998476232373=/root/.ansible/tmp/ansible-tmp-1726853739.751141-34109-183998476232373 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853739.77875: variable 'ansible_module_compression' from source: unknown 30583 1726853739.77917: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30583 1726853739.77948: variable 'ansible_facts' from source: unknown 30583 1726853739.78008: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853739.751141-34109-183998476232373/AnsiballZ_command.py 30583 1726853739.78181: Sending initial data 30583 1726853739.78184: Sent initial data (155 bytes) 30583 1726853739.78855: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853739.78880: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853739.78980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853739.80629: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30583 1726853739.80633: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853739.80701: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853739.80777: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpzloe1ta5 /root/.ansible/tmp/ansible-tmp-1726853739.751141-34109-183998476232373/AnsiballZ_command.py <<< 30583 1726853739.80780: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853739.751141-34109-183998476232373/AnsiballZ_command.py" <<< 30583 1726853739.80838: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpzloe1ta5" to remote "/root/.ansible/tmp/ansible-tmp-1726853739.751141-34109-183998476232373/AnsiballZ_command.py" <<< 30583 1726853739.80842: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853739.751141-34109-183998476232373/AnsiballZ_command.py" <<< 30583 1726853739.81950: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853739.81953: stderr chunk (state=3): >>><<< 30583 1726853739.81956: stdout chunk (state=3): >>><<< 30583 1726853739.81960: done transferring module to remote 30583 1726853739.81962: _low_level_execute_command(): starting 30583 1726853739.81964: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853739.751141-34109-183998476232373/ /root/.ansible/tmp/ansible-tmp-1726853739.751141-34109-183998476232373/AnsiballZ_command.py && sleep 0' 30583 1726853739.82548: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853739.82565: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853739.82612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853739.82624: stderr chunk (state=3): >>>debug2: match found <<< 30583 1726853739.82644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853739.82732: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853739.82747: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853739.82846: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853739.84977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853739.84981: stdout chunk (state=3): >>><<< 30583 1726853739.84984: stderr chunk (state=3): >>><<< 30583 1726853739.84987: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853739.84990: _low_level_execute_command(): starting 30583 1726853739.84993: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853739.751141-34109-183998476232373/AnsiballZ_command.py && sleep 0' 30583 1726853739.85479: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853739.85488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853739.85499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853739.85519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853739.85525: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853739.85556: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853739.85585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853739.85657: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853739.85665: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853739.85706: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853739.85792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853740.01641: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:35:40.011735", "end": "2024-09-20 13:35:40.015264", "delta": "0:00:00.003529", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30583 1726853740.03373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853740.03378: stdout chunk (state=3): >>><<< 30583 1726853740.03380: stderr chunk (state=3): >>><<< 30583 1726853740.03400: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:35:40.011735", "end": "2024-09-20 13:35:40.015264", "delta": "0:00:00.003529", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
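The module invocation recorded above (`chdir=/sys/class/net`, `_raw_params="ls -1"`) is the first task in `get_current_interfaces.yml`. A plausible reconstruction of that task, inferred only from the logged module args and task name, would look like the sketch below; the register name is taken from the later `variable '_current_interfaces'` log line, and the exact YAML in the test file may differ:

```yaml
# Sketch of the "Gather current interface info" task, reconstructed from the
# module args logged above (an assumption, not the verbatim file contents of
# tests/network/playbooks/tasks/get_current_interfaces.yml).
- name: Gather current interface info
  command:
    cmd: ls -1
    chdir: /sys/class/net   # each entry here names a network interface
  register: _current_interfaces
```

Listing `/sys/class/net` is a common way to enumerate interfaces without parsing `ip link` output; note it also surfaces the `bonding_masters` control file seen in the stdout above.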
30583 1726853740.03444: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853739.751141-34109-183998476232373/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853740.03479: _low_level_execute_command(): starting 30583 1726853740.03482: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853739.751141-34109-183998476232373/ > /dev/null 2>&1 && sleep 0' 30583 1726853740.04098: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853740.04189: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853740.04224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853740.04240: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853740.04260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853740.04355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853740.06331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853740.06335: stdout chunk (state=3): >>><<< 30583 1726853740.06376: stderr chunk (state=3): >>><<< 30583 1726853740.06380: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853740.06383: 
handler run complete 30583 1726853740.06395: Evaluated conditional (False): False 30583 1726853740.06408: attempt loop complete, returning result 30583 1726853740.06411: _execute() done 30583 1726853740.06413: dumping result to json 30583 1726853740.06418: done dumping result, returning 30583 1726853740.06428: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [02083763-bbaf-05ea-abc5-0000000017a8] 30583 1726853740.06430: sending task result for task 02083763-bbaf-05ea-abc5-0000000017a8 30583 1726853740.06642: done sending task result for task 02083763-bbaf-05ea-abc5-0000000017a8 30583 1726853740.06645: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003529", "end": "2024-09-20 13:35:40.015264", "rc": 0, "start": "2024-09-20 13:35:40.011735" } STDOUT: bonding_masters eth0 lo 30583 1726853740.06739: no more pending results, returning what we have 30583 1726853740.06745: results queue empty 30583 1726853740.06746: checking for any_errors_fatal 30583 1726853740.06748: done checking for any_errors_fatal 30583 1726853740.06749: checking for max_fail_percentage 30583 1726853740.06752: done checking for max_fail_percentage 30583 1726853740.06753: checking to see if all hosts have failed and the running result is not ok 30583 1726853740.06753: done checking to see if all hosts have failed 30583 1726853740.06754: getting the remaining hosts for this loop 30583 1726853740.06756: done getting the remaining hosts for this loop 30583 1726853740.06763: getting the next task for host managed_node2 30583 1726853740.06775: done getting next task for host managed_node2 30583 1726853740.06778: ^ task is: TASK: Set current_interfaces 30583 1726853740.06784: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853740.06791: getting variables 30583 1726853740.06793: in VariableManager get_vars() 30583 1726853740.06946: Calling all_inventory to load vars for managed_node2 30583 1726853740.06949: Calling groups_inventory to load vars for managed_node2 30583 1726853740.06952: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853740.06965: Calling all_plugins_play to load vars for managed_node2 30583 1726853740.06968: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853740.07020: Calling groups_plugins_play to load vars for managed_node2 30583 1726853740.08609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853740.10294: done with get_vars() 30583 1726853740.10323: done getting variables 30583 1726853740.10387: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:35:40 -0400 (0:00:00.394) 0:01:15.441 ****** 30583 1726853740.10428: entering _queue_task() for managed_node2/set_fact 30583 1726853740.10812: worker is 1 (out of 1 available) 30583 1726853740.10824: exiting _queue_task() for managed_node2/set_fact 30583 1726853740.10845: done queuing things up, now waiting for results queue to drain 30583 1726853740.10847: waiting for pending results... 30583 1726853740.11188: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 30583 1726853740.11477: in run() - task 02083763-bbaf-05ea-abc5-0000000017a9 30583 1726853740.11482: variable 'ansible_search_path' from source: unknown 30583 1726853740.11485: variable 'ansible_search_path' from source: unknown 30583 1726853740.11488: calling self._execute() 30583 1726853740.11491: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853740.11495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853740.11499: variable 'omit' from source: magic vars 30583 1726853740.11912: variable 'ansible_distribution_major_version' from source: facts 30583 1726853740.11933: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853740.11945: variable 'omit' from source: magic vars 30583 1726853740.11998: variable 'omit' from source: magic vars 30583 1726853740.12116: variable '_current_interfaces' from source: set_fact 30583 1726853740.12195: variable 'omit' from source: magic vars 30583 1726853740.12238: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 
1726853740.12288: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853740.12307: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853740.12325: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853740.12340: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853740.12384: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853740.12387: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853740.12389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853740.12506: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853740.12512: Set connection var ansible_timeout to 10 30583 1726853740.12515: Set connection var ansible_connection to ssh 30583 1726853740.12521: Set connection var ansible_shell_executable to /bin/sh 30583 1726853740.12524: Set connection var ansible_shell_type to sh 30583 1726853740.12533: Set connection var ansible_pipelining to False 30583 1726853740.12557: variable 'ansible_shell_executable' from source: unknown 30583 1726853740.12560: variable 'ansible_connection' from source: unknown 30583 1726853740.12565: variable 'ansible_module_compression' from source: unknown 30583 1726853740.12568: variable 'ansible_shell_type' from source: unknown 30583 1726853740.12579: variable 'ansible_shell_executable' from source: unknown 30583 1726853740.12582: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853740.12586: variable 'ansible_pipelining' from source: unknown 30583 1726853740.12595: variable 'ansible_timeout' from source: unknown 30583 1726853740.12775: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed_node2' 30583 1726853740.12780: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853740.12783: variable 'omit' from source: magic vars 30583 1726853740.12785: starting attempt loop 30583 1726853740.12787: running the handler 30583 1726853740.12789: handler run complete 30583 1726853740.12791: attempt loop complete, returning result 30583 1726853740.12797: _execute() done 30583 1726853740.12799: dumping result to json 30583 1726853740.12803: done dumping result, returning 30583 1726853740.12817: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [02083763-bbaf-05ea-abc5-0000000017a9] 30583 1726853740.12820: sending task result for task 02083763-bbaf-05ea-abc5-0000000017a9 30583 1726853740.12907: done sending task result for task 02083763-bbaf-05ea-abc5-0000000017a9 30583 1726853740.12910: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 30583 1726853740.12978: no more pending results, returning what we have 30583 1726853740.12982: results queue empty 30583 1726853740.12983: checking for any_errors_fatal 30583 1726853740.12994: done checking for any_errors_fatal 30583 1726853740.12995: checking for max_fail_percentage 30583 1726853740.12997: done checking for max_fail_percentage 30583 1726853740.12998: checking to see if all hosts have failed and the running result is not ok 30583 1726853740.12999: done checking to see if all hosts have failed 30583 1726853740.13000: getting the remaining hosts for this loop 30583 1726853740.13002: done getting the remaining hosts for this loop 30583 1726853740.13007: getting the next task for host managed_node2 
30583 1726853740.13018: done getting next task for host managed_node2 30583 1726853740.13021: ^ task is: TASK: Show current_interfaces 30583 1726853740.13025: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853740.13029: getting variables 30583 1726853740.13031: in VariableManager get_vars() 30583 1726853740.13279: Calling all_inventory to load vars for managed_node2 30583 1726853740.13282: Calling groups_inventory to load vars for managed_node2 30583 1726853740.13286: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853740.13295: Calling all_plugins_play to load vars for managed_node2 30583 1726853740.13298: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853740.13301: Calling groups_plugins_play to load vars for managed_node2 30583 1726853740.14777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853740.16454: done with get_vars() 30583 1726853740.16486: done getting variables 30583 1726853740.16568: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 13:35:40 -0400 (0:00:00.061) 0:01:15.503 ****** 30583 1726853740.16604: entering _queue_task() for managed_node2/debug 30583 1726853740.16952: worker is 1 (out of 1 available) 30583 1726853740.16966: exiting _queue_task() for managed_node2/debug 30583 1726853740.17180: done queuing things up, now waiting for results queue to drain 30583 1726853740.17182: waiting for pending results... 
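The trace above covers the `Set current_interfaces` task at `get_current_interfaces.yml:9`, a plain `set_fact` that resolved `_current_interfaces` and produced the `current_interfaces` fact shown in the result. A minimal sketch of what that task likely looks like — the variable names come from the log, but the exact templating expression is an assumption:

```yaml
# Hypothetical reconstruction of the task at
# tests/network/playbooks/tasks/get_current_interfaces.yml:9
- name: Set current_interfaces
  set_fact:
    # "_current_interfaces" is itself a fact per the log
    # ("variable '_current_interfaces' from source: set_fact");
    # the exact expression mapping it to the list is assumed.
    current_interfaces: "{{ _current_interfaces }}"
```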
30583 1726853740.17284: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 30583 1726853740.17419: in run() - task 02083763-bbaf-05ea-abc5-00000000176e 30583 1726853740.17431: variable 'ansible_search_path' from source: unknown 30583 1726853740.17435: variable 'ansible_search_path' from source: unknown 30583 1726853740.17468: calling self._execute() 30583 1726853740.17554: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853740.17558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853740.17570: variable 'omit' from source: magic vars 30583 1726853740.17855: variable 'ansible_distribution_major_version' from source: facts 30583 1726853740.17888: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853740.17892: variable 'omit' from source: magic vars 30583 1726853740.17915: variable 'omit' from source: magic vars 30583 1726853740.18254: variable 'current_interfaces' from source: set_fact 30583 1726853740.18258: variable 'omit' from source: magic vars 30583 1726853740.18261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853740.18299: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853740.18320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853740.18338: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853740.18350: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853740.18387: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853740.18391: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853740.18467: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853740.18505: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853740.18512: Set connection var ansible_timeout to 10 30583 1726853740.18515: Set connection var ansible_connection to ssh 30583 1726853740.18520: Set connection var ansible_shell_executable to /bin/sh 30583 1726853740.18523: Set connection var ansible_shell_type to sh 30583 1726853740.18576: Set connection var ansible_pipelining to False 30583 1726853740.18587: variable 'ansible_shell_executable' from source: unknown 30583 1726853740.18591: variable 'ansible_connection' from source: unknown 30583 1726853740.18593: variable 'ansible_module_compression' from source: unknown 30583 1726853740.18596: variable 'ansible_shell_type' from source: unknown 30583 1726853740.18598: variable 'ansible_shell_executable' from source: unknown 30583 1726853740.18600: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853740.18602: variable 'ansible_pipelining' from source: unknown 30583 1726853740.18603: variable 'ansible_timeout' from source: unknown 30583 1726853740.18606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853740.18804: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853740.18807: variable 'omit' from source: magic vars 30583 1726853740.18810: starting attempt loop 30583 1726853740.18812: running the handler 30583 1726853740.18815: handler run complete 30583 1726853740.18817: attempt loop complete, returning result 30583 1726853740.18819: _execute() done 30583 1726853740.18821: dumping result to json 30583 1726853740.18826: done dumping result, returning 30583 1726853740.18834: done 
running TaskExecutor() for managed_node2/TASK: Show current_interfaces [02083763-bbaf-05ea-abc5-00000000176e] 30583 1726853740.18837: sending task result for task 02083763-bbaf-05ea-abc5-00000000176e 30583 1726853740.18920: done sending task result for task 02083763-bbaf-05ea-abc5-00000000176e 30583 1726853740.18923: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 30583 1726853740.18986: no more pending results, returning what we have 30583 1726853740.18989: results queue empty 30583 1726853740.18990: checking for any_errors_fatal 30583 1726853740.18997: done checking for any_errors_fatal 30583 1726853740.18997: checking for max_fail_percentage 30583 1726853740.18999: done checking for max_fail_percentage 30583 1726853740.19000: checking to see if all hosts have failed and the running result is not ok 30583 1726853740.19001: done checking to see if all hosts have failed 30583 1726853740.19002: getting the remaining hosts for this loop 30583 1726853740.19003: done getting the remaining hosts for this loop 30583 1726853740.19007: getting the next task for host managed_node2 30583 1726853740.19016: done getting next task for host managed_node2 30583 1726853740.19018: ^ task is: TASK: Setup 30583 1726853740.19021: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853740.19026: getting variables 30583 1726853740.19027: in VariableManager get_vars() 30583 1726853740.19075: Calling all_inventory to load vars for managed_node2 30583 1726853740.19079: Calling groups_inventory to load vars for managed_node2 30583 1726853740.19082: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853740.19092: Calling all_plugins_play to load vars for managed_node2 30583 1726853740.19095: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853740.19097: Calling groups_plugins_play to load vars for managed_node2 30583 1726853740.21138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853740.22816: done with get_vars() 30583 1726853740.22844: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 13:35:40 -0400 (0:00:00.065) 0:01:15.569 ****** 30583 1726853740.23193: entering _queue_task() for managed_node2/include_tasks 30583 1726853740.23823: worker is 1 (out of 1 available) 30583 1726853740.23839: exiting _queue_task() for managed_node2/include_tasks 30583 1726853740.23852: done queuing things up, now waiting for results queue to drain 30583 1726853740.23853: waiting for pending results... 
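The `Show current_interfaces` task that produced `MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo']` is a `debug` action at `show_interfaces.yml:5`. A sketch consistent with that output (the precise `msg` template is an assumption inferred from the printed message):

```yaml
# Hypothetical reconstruction of the task at
# tests/network/playbooks/tasks/show_interfaces.yml:5
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```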
30583 1726853740.24189: running TaskExecutor() for managed_node2/TASK: Setup 30583 1726853740.24194: in run() - task 02083763-bbaf-05ea-abc5-000000001747 30583 1726853740.24202: variable 'ansible_search_path' from source: unknown 30583 1726853740.24210: variable 'ansible_search_path' from source: unknown 30583 1726853740.24274: variable 'lsr_setup' from source: include params 30583 1726853740.24497: variable 'lsr_setup' from source: include params 30583 1726853740.24574: variable 'omit' from source: magic vars 30583 1726853740.24686: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853740.24693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853740.24702: variable 'omit' from source: magic vars 30583 1726853740.24876: variable 'ansible_distribution_major_version' from source: facts 30583 1726853740.24886: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853740.24892: variable 'item' from source: unknown 30583 1726853740.24937: variable 'item' from source: unknown 30583 1726853740.24957: variable 'item' from source: unknown 30583 1726853740.25006: variable 'item' from source: unknown 30583 1726853740.25129: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853740.25133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853740.25136: variable 'omit' from source: magic vars 30583 1726853740.25209: variable 'ansible_distribution_major_version' from source: facts 30583 1726853740.25212: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853740.25218: variable 'item' from source: unknown 30583 1726853740.25264: variable 'item' from source: unknown 30583 1726853740.25283: variable 'item' from source: unknown 30583 1726853740.25325: variable 'item' from source: unknown 30583 1726853740.25390: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 
1726853740.25393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853740.25403: variable 'omit' from source: magic vars 30583 1726853740.25499: variable 'ansible_distribution_major_version' from source: facts 30583 1726853740.25507: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853740.25515: variable 'item' from source: unknown 30583 1726853740.25551: variable 'item' from source: unknown 30583 1726853740.25576: variable 'item' from source: unknown 30583 1726853740.25620: variable 'item' from source: unknown 30583 1726853740.25680: dumping result to json 30583 1726853740.25682: done dumping result, returning 30583 1726853740.25684: done running TaskExecutor() for managed_node2/TASK: Setup [02083763-bbaf-05ea-abc5-000000001747] 30583 1726853740.25686: sending task result for task 02083763-bbaf-05ea-abc5-000000001747 30583 1726853740.25715: done sending task result for task 02083763-bbaf-05ea-abc5-000000001747 30583 1726853740.25718: WORKER PROCESS EXITING 30583 1726853740.25754: no more pending results, returning what we have 30583 1726853740.25761: in VariableManager get_vars() 30583 1726853740.25804: Calling all_inventory to load vars for managed_node2 30583 1726853740.25806: Calling groups_inventory to load vars for managed_node2 30583 1726853740.25810: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853740.25823: Calling all_plugins_play to load vars for managed_node2 30583 1726853740.25826: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853740.25830: Calling groups_plugins_play to load vars for managed_node2 30583 1726853740.27097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853740.28468: done with get_vars() 30583 1726853740.28487: variable 'ansible_search_path' from source: unknown 30583 1726853740.28488: variable 'ansible_search_path' from source: unknown 30583 
1726853740.28516: variable 'ansible_search_path' from source: unknown 30583 1726853740.28517: variable 'ansible_search_path' from source: unknown 30583 1726853740.28533: variable 'ansible_search_path' from source: unknown 30583 1726853740.28534: variable 'ansible_search_path' from source: unknown 30583 1726853740.28549: we have included files to process 30583 1726853740.28550: generating all_blocks data 30583 1726853740.28551: done generating all_blocks data 30583 1726853740.28554: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30583 1726853740.28555: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30583 1726853740.28557: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30583 1726853740.28724: done processing included file 30583 1726853740.28726: iterating over new_blocks loaded from include file 30583 1726853740.28727: in VariableManager get_vars() 30583 1726853740.28737: done with get_vars() 30583 1726853740.28738: filtering new block on tags 30583 1726853740.28762: done filtering new block on tags 30583 1726853740.28764: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node2 => (item=tasks/create_bridge_profile.yml) 30583 1726853740.28767: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30583 1726853740.28768: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30583 1726853740.28770: Loading data from 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30583 1726853740.28829: done processing included file 30583 1726853740.28830: iterating over new_blocks loaded from include file 30583 1726853740.28831: in VariableManager get_vars() 30583 1726853740.28840: done with get_vars() 30583 1726853740.28841: filtering new block on tags 30583 1726853740.28854: done filtering new block on tags 30583 1726853740.28855: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed_node2 => (item=tasks/activate_profile.yml) 30583 1726853740.28860: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 30583 1726853740.28860: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 30583 1726853740.28862: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 30583 1726853740.28919: done processing included file 30583 1726853740.28920: iterating over new_blocks loaded from include file 30583 1726853740.28921: in VariableManager get_vars() 30583 1726853740.28930: done with get_vars() 30583 1726853740.28931: filtering new block on tags 30583 1726853740.28943: done filtering new block on tags 30583 1726853740.28944: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml for managed_node2 => (item=tasks/remove_profile.yml) 30583 1726853740.28946: extending task lists for all hosts with included blocks 30583 1726853740.29417: done extending task lists 30583 1726853740.29422: done processing included files 30583 
1726853740.29422: results queue empty 30583 1726853740.29423: checking for any_errors_fatal 30583 1726853740.29425: done checking for any_errors_fatal 30583 1726853740.29426: checking for max_fail_percentage 30583 1726853740.29426: done checking for max_fail_percentage 30583 1726853740.29427: checking to see if all hosts have failed and the running result is not ok 30583 1726853740.29427: done checking to see if all hosts have failed 30583 1726853740.29428: getting the remaining hosts for this loop 30583 1726853740.29429: done getting the remaining hosts for this loop 30583 1726853740.29430: getting the next task for host managed_node2 30583 1726853740.29433: done getting next task for host managed_node2 30583 1726853740.29435: ^ task is: TASK: Include network role 30583 1726853740.29437: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853740.29439: getting variables 30583 1726853740.29440: in VariableManager get_vars() 30583 1726853740.29446: Calling all_inventory to load vars for managed_node2 30583 1726853740.29448: Calling groups_inventory to load vars for managed_node2 30583 1726853740.29450: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853740.29455: Calling all_plugins_play to load vars for managed_node2 30583 1726853740.29457: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853740.29461: Calling groups_plugins_play to load vars for managed_node2 30583 1726853740.30189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853740.31391: done with get_vars() 30583 1726853740.31407: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Friday 20 September 2024 13:35:40 -0400 (0:00:00.082) 0:01:15.651 ****** 30583 1726853740.31461: entering _queue_task() for managed_node2/include_role 30583 1726853740.31723: worker is 1 (out of 1 available) 30583 1726853740.31738: exiting _queue_task() for managed_node2/include_role 30583 1726853740.31753: done queuing things up, now waiting for results queue to drain 30583 1726853740.31755: waiting for pending results... 
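The `Setup` task above evaluated its conditional three times and included `create_bridge_profile.yml`, `activate_profile.yml`, and `remove_profile.yml` in turn, driven by the `lsr_setup` include parameter. A hedged sketch of the driving task at `run_test.yml:24` — the loop structure is inferred from the repeated `variable 'item'` entries, and the `lsr_setup` values from the files the log reports as included:

```yaml
# Hypothetical reconstruction of the task at
# tests/network/playbooks/tasks/run_test.yml:24
- name: Setup
  include_tasks: "{{ item }}"
  loop: "{{ lsr_setup }}"
  # per the log, lsr_setup here evaluates to something like:
  #   - tasks/create_bridge_profile.yml
  #   - tasks/activate_profile.yml
  #   - tasks/remove_profile.yml
```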
30583 1726853740.31946: running TaskExecutor() for managed_node2/TASK: Include network role 30583 1726853740.32031: in run() - task 02083763-bbaf-05ea-abc5-0000000017d0 30583 1726853740.32042: variable 'ansible_search_path' from source: unknown 30583 1726853740.32047: variable 'ansible_search_path' from source: unknown 30583 1726853740.32079: calling self._execute() 30583 1726853740.32155: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853740.32159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853740.32172: variable 'omit' from source: magic vars 30583 1726853740.32456: variable 'ansible_distribution_major_version' from source: facts 30583 1726853740.32469: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853740.32476: _execute() done 30583 1726853740.32479: dumping result to json 30583 1726853740.32481: done dumping result, returning 30583 1726853740.32489: done running TaskExecutor() for managed_node2/TASK: Include network role [02083763-bbaf-05ea-abc5-0000000017d0] 30583 1726853740.32492: sending task result for task 02083763-bbaf-05ea-abc5-0000000017d0 30583 1726853740.32597: done sending task result for task 02083763-bbaf-05ea-abc5-0000000017d0 30583 1726853740.32600: WORKER PROCESS EXITING 30583 1726853740.32652: no more pending results, returning what we have 30583 1726853740.32657: in VariableManager get_vars() 30583 1726853740.32699: Calling all_inventory to load vars for managed_node2 30583 1726853740.32702: Calling groups_inventory to load vars for managed_node2 30583 1726853740.32706: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853740.32718: Calling all_plugins_play to load vars for managed_node2 30583 1726853740.32721: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853740.32723: Calling groups_plugins_play to load vars for managed_node2 30583 1726853740.33591: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853740.34441: done with get_vars() 30583 1726853740.34454: variable 'ansible_search_path' from source: unknown 30583 1726853740.34455: variable 'ansible_search_path' from source: unknown 30583 1726853740.34565: variable 'omit' from source: magic vars 30583 1726853740.34593: variable 'omit' from source: magic vars 30583 1726853740.34603: variable 'omit' from source: magic vars 30583 1726853740.34606: we have included files to process 30583 1726853740.34606: generating all_blocks data 30583 1726853740.34607: done generating all_blocks data 30583 1726853740.34608: processing included file: fedora.linux_system_roles.network 30583 1726853740.34621: in VariableManager get_vars() 30583 1726853740.34629: done with get_vars() 30583 1726853740.34648: in VariableManager get_vars() 30583 1726853740.34660: done with get_vars() 30583 1726853740.34689: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30583 1726853740.34756: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30583 1726853740.34806: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30583 1726853740.35067: in VariableManager get_vars() 30583 1726853740.35082: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30583 1726853740.36293: iterating over new_blocks loaded from include file 30583 1726853740.36295: in VariableManager get_vars() 30583 1726853740.36305: done with get_vars() 30583 1726853740.36306: filtering new block on tags 30583 1726853740.36463: done filtering new block on tags 30583 1726853740.36466: in VariableManager get_vars() 30583 1726853740.36478: done with get_vars() 30583 1726853740.36479: filtering new block on tags 30583 1726853740.36489: done 
filtering new block on tags 30583 1726853740.36490: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30583 1726853740.36494: extending task lists for all hosts with included blocks 30583 1726853740.36588: done extending task lists 30583 1726853740.36589: done processing included files 30583 1726853740.36589: results queue empty 30583 1726853740.36590: checking for any_errors_fatal 30583 1726853740.36592: done checking for any_errors_fatal 30583 1726853740.36592: checking for max_fail_percentage 30583 1726853740.36593: done checking for max_fail_percentage 30583 1726853740.36594: checking to see if all hosts have failed and the running result is not ok 30583 1726853740.36594: done checking to see if all hosts have failed 30583 1726853740.36595: getting the remaining hosts for this loop 30583 1726853740.36596: done getting the remaining hosts for this loop 30583 1726853740.36597: getting the next task for host managed_node2 30583 1726853740.36600: done getting next task for host managed_node2 30583 1726853740.36602: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30583 1726853740.36604: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853740.36610: getting variables 30583 1726853740.36611: in VariableManager get_vars() 30583 1726853740.36619: Calling all_inventory to load vars for managed_node2 30583 1726853740.36621: Calling groups_inventory to load vars for managed_node2 30583 1726853740.36622: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853740.36626: Calling all_plugins_play to load vars for managed_node2 30583 1726853740.36627: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853740.36629: Calling groups_plugins_play to load vars for managed_node2 30583 1726853740.37269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853740.38189: done with get_vars() 30583 1726853740.38204: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:35:40 -0400 (0:00:00.067) 0:01:15.719 ****** 30583 1726853740.38253: entering _queue_task() for managed_node2/include_tasks 30583 1726853740.38519: worker is 1 (out of 1 available) 30583 1726853740.38532: exiting _queue_task() for managed_node2/include_tasks 30583 1726853740.38545: done queuing things up, now waiting for results queue to drain 30583 1726853740.38547: waiting for pending results... 
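The `Include network role` task at `create_bridge_profile.yml:3` pulls in the role whose defaults, meta, and tasks files the log then loads. A minimal sketch — the role name is confirmed by the log's `included: fedora.linux_system_roles.network` line, but any role variables passed alongside it are omitted here as unknown:

```yaml
# Hypothetical reconstruction of the task at
# tests/network/playbooks/tasks/create_bridge_profile.yml:3
- name: Include network role
  include_role:
    name: fedora.linux_system_roles.network
  # role vars (e.g. network_connections) are not visible
  # in this portion of the log and are intentionally omitted
```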
30583 1726853740.38735: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30583 1726853740.38827: in run() - task 02083763-bbaf-05ea-abc5-00000000183a 30583 1726853740.38837: variable 'ansible_search_path' from source: unknown 30583 1726853740.38840: variable 'ansible_search_path' from source: unknown 30583 1726853740.38872: calling self._execute() 30583 1726853740.38947: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853740.38951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853740.38960: variable 'omit' from source: magic vars 30583 1726853740.39244: variable 'ansible_distribution_major_version' from source: facts 30583 1726853740.39253: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853740.39258: _execute() done 30583 1726853740.39265: dumping result to json 30583 1726853740.39268: done dumping result, returning 30583 1726853740.39276: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-05ea-abc5-00000000183a] 30583 1726853740.39279: sending task result for task 02083763-bbaf-05ea-abc5-00000000183a 30583 1726853740.39361: done sending task result for task 02083763-bbaf-05ea-abc5-00000000183a 30583 1726853740.39363: WORKER PROCESS EXITING 30583 1726853740.39412: no more pending results, returning what we have 30583 1726853740.39418: in VariableManager get_vars() 30583 1726853740.39467: Calling all_inventory to load vars for managed_node2 30583 1726853740.39470: Calling groups_inventory to load vars for managed_node2 30583 1726853740.39480: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853740.39491: Calling all_plugins_play to load vars for managed_node2 30583 1726853740.39494: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853740.39497: Calling 
groups_plugins_play to load vars for managed_node2 30583 1726853740.40269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853740.41120: done with get_vars() 30583 1726853740.41134: variable 'ansible_search_path' from source: unknown 30583 1726853740.41135: variable 'ansible_search_path' from source: unknown 30583 1726853740.41160: we have included files to process 30583 1726853740.41161: generating all_blocks data 30583 1726853740.41162: done generating all_blocks data 30583 1726853740.41164: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853740.41165: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853740.41166: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853740.41535: done processing included file 30583 1726853740.41537: iterating over new_blocks loaded from include file 30583 1726853740.41538: in VariableManager get_vars() 30583 1726853740.41553: done with get_vars() 30583 1726853740.41554: filtering new block on tags 30583 1726853740.41575: done filtering new block on tags 30583 1726853740.41577: in VariableManager get_vars() 30583 1726853740.41591: done with get_vars() 30583 1726853740.41592: filtering new block on tags 30583 1726853740.41616: done filtering new block on tags 30583 1726853740.41617: in VariableManager get_vars() 30583 1726853740.41630: done with get_vars() 30583 1726853740.41631: filtering new block on tags 30583 1726853740.41656: done filtering new block on tags 30583 1726853740.41658: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30583 1726853740.41662: extending task lists for 
all hosts with included blocks 30583 1726853740.42614: done extending task lists 30583 1726853740.42615: done processing included files 30583 1726853740.42616: results queue empty 30583 1726853740.42617: checking for any_errors_fatal 30583 1726853740.42620: done checking for any_errors_fatal 30583 1726853740.42620: checking for max_fail_percentage 30583 1726853740.42621: done checking for max_fail_percentage 30583 1726853740.42622: checking to see if all hosts have failed and the running result is not ok 30583 1726853740.42622: done checking to see if all hosts have failed 30583 1726853740.42623: getting the remaining hosts for this loop 30583 1726853740.42624: done getting the remaining hosts for this loop 30583 1726853740.42625: getting the next task for host managed_node2 30583 1726853740.42629: done getting next task for host managed_node2 30583 1726853740.42630: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30583 1726853740.42633: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853740.42641: getting variables 30583 1726853740.42642: in VariableManager get_vars() 30583 1726853740.42651: Calling all_inventory to load vars for managed_node2 30583 1726853740.42652: Calling groups_inventory to load vars for managed_node2 30583 1726853740.42654: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853740.42657: Calling all_plugins_play to load vars for managed_node2 30583 1726853740.42660: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853740.42661: Calling groups_plugins_play to load vars for managed_node2 30583 1726853740.47137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853740.47995: done with get_vars() 30583 1726853740.48015: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:35:40 -0400 (0:00:00.098) 0:01:15.818 ****** 30583 1726853740.48078: entering _queue_task() for managed_node2/setup 30583 1726853740.48357: worker is 1 (out of 1 available) 30583 1726853740.48374: exiting _queue_task() for managed_node2/setup 30583 1726853740.48386: done queuing things up, now waiting for results queue to drain 30583 1726853740.48389: waiting for pending results... 
30583 1726853740.48584: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30583 1726853740.48698: in run() - task 02083763-bbaf-05ea-abc5-000000001897 30583 1726853740.48708: variable 'ansible_search_path' from source: unknown 30583 1726853740.48712: variable 'ansible_search_path' from source: unknown 30583 1726853740.48745: calling self._execute() 30583 1726853740.48820: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853740.48828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853740.48832: variable 'omit' from source: magic vars 30583 1726853740.49131: variable 'ansible_distribution_major_version' from source: facts 30583 1726853740.49141: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853740.49299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853740.50791: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853740.50844: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853740.50874: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853740.50900: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853740.50922: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853740.50981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853740.51003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853740.51023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853740.51050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853740.51061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853740.51101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853740.51116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853740.51136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853740.51160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853740.51175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853740.51288: variable '__network_required_facts' from source: role 
'' defaults 30583 1726853740.51295: variable 'ansible_facts' from source: unknown 30583 1726853740.51744: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30583 1726853740.51748: when evaluation is False, skipping this task 30583 1726853740.51750: _execute() done 30583 1726853740.51753: dumping result to json 30583 1726853740.51755: done dumping result, returning 30583 1726853740.51762: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-05ea-abc5-000000001897] 30583 1726853740.51768: sending task result for task 02083763-bbaf-05ea-abc5-000000001897 30583 1726853740.51858: done sending task result for task 02083763-bbaf-05ea-abc5-000000001897 30583 1726853740.51861: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853740.51923: no more pending results, returning what we have 30583 1726853740.51928: results queue empty 30583 1726853740.51929: checking for any_errors_fatal 30583 1726853740.51930: done checking for any_errors_fatal 30583 1726853740.51931: checking for max_fail_percentage 30583 1726853740.51933: done checking for max_fail_percentage 30583 1726853740.51933: checking to see if all hosts have failed and the running result is not ok 30583 1726853740.51934: done checking to see if all hosts have failed 30583 1726853740.51935: getting the remaining hosts for this loop 30583 1726853740.51937: done getting the remaining hosts for this loop 30583 1726853740.51940: getting the next task for host managed_node2 30583 1726853740.51951: done getting next task for host managed_node2 30583 1726853740.51954: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30583 1726853740.51961: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853740.51984: getting variables 30583 1726853740.51986: in VariableManager get_vars() 30583 1726853740.52025: Calling all_inventory to load vars for managed_node2 30583 1726853740.52028: Calling groups_inventory to load vars for managed_node2 30583 1726853740.52031: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853740.52039: Calling all_plugins_play to load vars for managed_node2 30583 1726853740.52042: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853740.52051: Calling groups_plugins_play to load vars for managed_node2 30583 1726853740.52852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853740.53728: done with get_vars() 30583 1726853740.53744: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:35:40 -0400 (0:00:00.057) 0:01:15.875 ****** 30583 1726853740.53818: entering _queue_task() for managed_node2/stat 30583 1726853740.54057: worker is 1 (out of 1 available) 30583 1726853740.54075: exiting _queue_task() for managed_node2/stat 30583 1726853740.54088: done queuing things up, now waiting for results queue to drain 30583 1726853740.54090: waiting for pending results... 
30583 1726853740.54278: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30583 1726853740.54395: in run() - task 02083763-bbaf-05ea-abc5-000000001899 30583 1726853740.54405: variable 'ansible_search_path' from source: unknown 30583 1726853740.54409: variable 'ansible_search_path' from source: unknown 30583 1726853740.54441: calling self._execute() 30583 1726853740.54519: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853740.54523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853740.54535: variable 'omit' from source: magic vars 30583 1726853740.54815: variable 'ansible_distribution_major_version' from source: facts 30583 1726853740.54825: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853740.54939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853740.55143: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853740.55179: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853740.55235: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853740.55265: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853740.55331: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853740.55348: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853740.55369: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853740.55387: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853740.55456: variable '__network_is_ostree' from source: set_fact 30583 1726853740.55464: Evaluated conditional (not __network_is_ostree is defined): False 30583 1726853740.55467: when evaluation is False, skipping this task 30583 1726853740.55469: _execute() done 30583 1726853740.55474: dumping result to json 30583 1726853740.55477: done dumping result, returning 30583 1726853740.55484: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-05ea-abc5-000000001899] 30583 1726853740.55487: sending task result for task 02083763-bbaf-05ea-abc5-000000001899 30583 1726853740.55568: done sending task result for task 02083763-bbaf-05ea-abc5-000000001899 30583 1726853740.55572: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30583 1726853740.55627: no more pending results, returning what we have 30583 1726853740.55631: results queue empty 30583 1726853740.55632: checking for any_errors_fatal 30583 1726853740.55641: done checking for any_errors_fatal 30583 1726853740.55642: checking for max_fail_percentage 30583 1726853740.55644: done checking for max_fail_percentage 30583 1726853740.55645: checking to see if all hosts have failed and the running result is not ok 30583 1726853740.55646: done checking to see if all hosts have failed 30583 1726853740.55646: getting the remaining hosts for this loop 30583 1726853740.55649: done getting the remaining hosts for this loop 30583 
1726853740.55652: getting the next task for host managed_node2 30583 1726853740.55659: done getting next task for host managed_node2 30583 1726853740.55662: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30583 1726853740.55668: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853740.55691: getting variables 30583 1726853740.55693: in VariableManager get_vars() 30583 1726853740.55727: Calling all_inventory to load vars for managed_node2 30583 1726853740.55729: Calling groups_inventory to load vars for managed_node2 30583 1726853740.55731: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853740.55740: Calling all_plugins_play to load vars for managed_node2 30583 1726853740.55743: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853740.55746: Calling groups_plugins_play to load vars for managed_node2 30583 1726853740.56644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853740.57528: done with get_vars() 30583 1726853740.57542: done getting variables 30583 1726853740.57586: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:35:40 -0400 (0:00:00.037) 0:01:15.913 ****** 30583 1726853740.57612: entering _queue_task() for managed_node2/set_fact 30583 1726853740.57843: worker is 1 (out of 1 available) 30583 1726853740.57857: exiting _queue_task() for managed_node2/set_fact 30583 1726853740.57868: done queuing things up, now waiting for results queue to drain 30583 1726853740.57869: waiting for pending results... 
30583 1726853740.58051: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30583 1726853740.58154: in run() - task 02083763-bbaf-05ea-abc5-00000000189a 30583 1726853740.58167: variable 'ansible_search_path' from source: unknown 30583 1726853740.58173: variable 'ansible_search_path' from source: unknown 30583 1726853740.58203: calling self._execute() 30583 1726853740.58281: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853740.58284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853740.58294: variable 'omit' from source: magic vars 30583 1726853740.58577: variable 'ansible_distribution_major_version' from source: facts 30583 1726853740.58587: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853740.58706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853740.58907: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853740.58938: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853740.58999: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853740.59027: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853740.59096: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853740.59116: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853740.59133: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853740.59150: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853740.59220: variable '__network_is_ostree' from source: set_fact 30583 1726853740.59226: Evaluated conditional (not __network_is_ostree is defined): False 30583 1726853740.59229: when evaluation is False, skipping this task 30583 1726853740.59232: _execute() done 30583 1726853740.59234: dumping result to json 30583 1726853740.59236: done dumping result, returning 30583 1726853740.59244: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-05ea-abc5-00000000189a] 30583 1726853740.59246: sending task result for task 02083763-bbaf-05ea-abc5-00000000189a 30583 1726853740.59330: done sending task result for task 02083763-bbaf-05ea-abc5-00000000189a 30583 1726853740.59332: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30583 1726853740.59383: no more pending results, returning what we have 30583 1726853740.59387: results queue empty 30583 1726853740.59388: checking for any_errors_fatal 30583 1726853740.59395: done checking for any_errors_fatal 30583 1726853740.59395: checking for max_fail_percentage 30583 1726853740.59397: done checking for max_fail_percentage 30583 1726853740.59398: checking to see if all hosts have failed and the running result is not ok 30583 1726853740.59399: done checking to see if all hosts have failed 30583 1726853740.59400: getting the remaining hosts for this loop 30583 1726853740.59401: done getting the remaining hosts for this loop 
30583 1726853740.59405: getting the next task for host managed_node2 30583 1726853740.59415: done getting next task for host managed_node2 30583 1726853740.59421: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853740.59426: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853740.59446: getting variables 30583 1726853740.59448: in VariableManager get_vars() 30583 1726853740.59480: Calling all_inventory to load vars for managed_node2 30583 1726853740.59483: Calling groups_inventory to load vars for managed_node2 30583 1726853740.59485: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853740.59493: Calling all_plugins_play to load vars for managed_node2 30583 1726853740.59495: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853740.59498: Calling groups_plugins_play to load vars for managed_node2 30583 1726853740.60255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853740.61234: done with get_vars() 30583 1726853740.61248: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:35:40 -0400 (0:00:00.037) 0:01:15.950 ****** 30583 1726853740.61319: entering _queue_task() for managed_node2/service_facts 30583 1726853740.61543: worker is 1 (out of 1 available) 30583 1726853740.61557: exiting _queue_task() for managed_node2/service_facts 30583 1726853740.61568: done queuing things up, now waiting for results queue to drain 30583 1726853740.61569: waiting for pending results... 
30583 1726853740.61753: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853740.61860: in run() - task 02083763-bbaf-05ea-abc5-00000000189c 30583 1726853740.61874: variable 'ansible_search_path' from source: unknown 30583 1726853740.61877: variable 'ansible_search_path' from source: unknown 30583 1726853740.61909: calling self._execute() 30583 1726853740.61983: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853740.61987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853740.61995: variable 'omit' from source: magic vars 30583 1726853740.62283: variable 'ansible_distribution_major_version' from source: facts 30583 1726853740.62293: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853740.62298: variable 'omit' from source: magic vars 30583 1726853740.62348: variable 'omit' from source: magic vars 30583 1726853740.62375: variable 'omit' from source: magic vars 30583 1726853740.62407: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853740.62433: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853740.62453: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853740.62468: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853740.62479: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853740.62504: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853740.62507: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853740.62510: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853740.62584: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853740.62588: Set connection var ansible_timeout to 10 30583 1726853740.62591: Set connection var ansible_connection to ssh 30583 1726853740.62596: Set connection var ansible_shell_executable to /bin/sh 30583 1726853740.62599: Set connection var ansible_shell_type to sh 30583 1726853740.62606: Set connection var ansible_pipelining to False 30583 1726853740.62624: variable 'ansible_shell_executable' from source: unknown 30583 1726853740.62627: variable 'ansible_connection' from source: unknown 30583 1726853740.62630: variable 'ansible_module_compression' from source: unknown 30583 1726853740.62633: variable 'ansible_shell_type' from source: unknown 30583 1726853740.62637: variable 'ansible_shell_executable' from source: unknown 30583 1726853740.62639: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853740.62641: variable 'ansible_pipelining' from source: unknown 30583 1726853740.62644: variable 'ansible_timeout' from source: unknown 30583 1726853740.62646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853740.62791: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853740.62799: variable 'omit' from source: magic vars 30583 1726853740.62804: starting attempt loop 30583 1726853740.62807: running the handler 30583 1726853740.62818: _low_level_execute_command(): starting 30583 1726853740.62825: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853740.63345: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30583 1726853740.63349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853740.63353: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853740.63356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853740.63400: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853740.63403: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853740.63405: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853740.63490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853740.65210: stdout chunk (state=3): >>>/root <<< 30583 1726853740.65312: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853740.65341: stderr chunk (state=3): >>><<< 30583 1726853740.65345: stdout chunk (state=3): >>><<< 30583 1726853740.65365: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853740.65377: _low_level_execute_command(): starting 30583 1726853740.65383: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853740.6536243-34143-164351165287111 `" && echo ansible-tmp-1726853740.6536243-34143-164351165287111="` echo /root/.ansible/tmp/ansible-tmp-1726853740.6536243-34143-164351165287111 `" ) && sleep 0' 30583 1726853740.65823: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853740.65827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853740.65830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853740.65841: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853740.65844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853740.65877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853740.65881: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853740.65892: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853740.65961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853740.67927: stdout chunk (state=3): >>>ansible-tmp-1726853740.6536243-34143-164351165287111=/root/.ansible/tmp/ansible-tmp-1726853740.6536243-34143-164351165287111 <<< 30583 1726853740.68035: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853740.68059: stderr chunk (state=3): >>><<< 30583 1726853740.68063: stdout chunk (state=3): >>><<< 30583 1726853740.68083: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853740.6536243-34143-164351165287111=/root/.ansible/tmp/ansible-tmp-1726853740.6536243-34143-164351165287111 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853740.68121: variable 'ansible_module_compression' from source: unknown 30583 1726853740.68156: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30583 1726853740.68194: variable 'ansible_facts' from source: unknown 30583 1726853740.68250: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853740.6536243-34143-164351165287111/AnsiballZ_service_facts.py 30583 1726853740.68353: Sending initial data 30583 1726853740.68357: Sent initial data (162 bytes) 30583 1726853740.68793: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853740.68797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853740.68799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853740.68811: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853740.68863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853740.68866: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853740.68948: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853740.70591: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853740.70662: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853740.70741: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpc07q5qbd /root/.ansible/tmp/ansible-tmp-1726853740.6536243-34143-164351165287111/AnsiballZ_service_facts.py <<< 30583 1726853740.70744: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853740.6536243-34143-164351165287111/AnsiballZ_service_facts.py" <<< 30583 1726853740.70810: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpc07q5qbd" to remote "/root/.ansible/tmp/ansible-tmp-1726853740.6536243-34143-164351165287111/AnsiballZ_service_facts.py" <<< 30583 1726853740.70813: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853740.6536243-34143-164351165287111/AnsiballZ_service_facts.py" <<< 30583 1726853740.71500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853740.71543: stderr chunk (state=3): >>><<< 30583 1726853740.71546: stdout chunk (state=3): >>><<< 30583 1726853740.71606: done transferring module to remote 30583 1726853740.71615: _low_level_execute_command(): starting 30583 1726853740.71620: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853740.6536243-34143-164351165287111/ /root/.ansible/tmp/ansible-tmp-1726853740.6536243-34143-164351165287111/AnsiballZ_service_facts.py && sleep 0' 30583 1726853740.72040: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853740.72049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853740.72068: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853740.72086: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853740.72096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853740.72139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853740.72143: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853740.72147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853740.72218: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853740.74070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853740.74091: stderr chunk (state=3): >>><<< 30583 1726853740.74094: stdout chunk (state=3): >>><<< 30583 1726853740.74109: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853740.74112: _low_level_execute_command(): starting 30583 1726853740.74114: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853740.6536243-34143-164351165287111/AnsiballZ_service_facts.py && sleep 0' 30583 1726853740.74533: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853740.74536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853740.74539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853740.74541: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853740.74543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853740.74592: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853740.74595: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853740.74675: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853742.39405: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 30583 1726853742.39470: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": 
"stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": 
"restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": 
"systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": 
"unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 30583 1726853742.39485: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": 
{"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "st<<< 30583 1726853742.39513: stdout chunk (state=3): >>>atic", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30583 1726853742.41079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853742.41091: stderr chunk (state=3): >>><<< 30583 1726853742.41094: stdout chunk (state=3): >>><<< 30583 1726853742.41120: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": 
"fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": 
"plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": 
{"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": 
"systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", 
"status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": 
{"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": 
"systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853742.41584: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853740.6536243-34143-164351165287111/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853742.41593: _low_level_execute_command(): starting 30583 1726853742.41597: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853740.6536243-34143-164351165287111/ > /dev/null 2>&1 && sleep 0' 30583 1726853742.42033: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853742.42038: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853742.42041: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853742.42043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853742.42097: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853742.42101: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853742.42105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853742.42195: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853742.44120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853742.44137: stderr chunk (state=3): >>><<< 30583 1726853742.44140: stdout chunk (state=3): >>><<< 30583 1726853742.44153: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853742.44159: handler run complete 30583 1726853742.44284: variable 'ansible_facts' from source: unknown 30583 1726853742.44391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853742.44677: variable 'ansible_facts' from source: unknown 30583 1726853742.44760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853742.44879: attempt loop complete, returning result 30583 1726853742.44882: _execute() done 30583 1726853742.44885: dumping result to json 30583 1726853742.44922: done dumping result, returning 30583 1726853742.44930: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-05ea-abc5-00000000189c] 30583 1726853742.44933: sending task result for task 02083763-bbaf-05ea-abc5-00000000189c 30583 1726853742.45746: done sending task result for task 02083763-bbaf-05ea-abc5-00000000189c 30583 1726853742.45750: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853742.45842: no more pending results, returning what we have 30583 1726853742.45845: results queue empty 30583 1726853742.45846: checking for any_errors_fatal 30583 1726853742.45851: done checking for any_errors_fatal 30583 1726853742.45851: checking for max_fail_percentage 30583 1726853742.45853: done checking for max_fail_percentage 30583 1726853742.45854: checking to see if all hosts have failed and the running result is not ok 30583 1726853742.45855: 
done checking to see if all hosts have failed 30583 1726853742.45856: getting the remaining hosts for this loop 30583 1726853742.45857: done getting the remaining hosts for this loop 30583 1726853742.45860: getting the next task for host managed_node2 30583 1726853742.45867: done getting next task for host managed_node2 30583 1726853742.45870: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 1726853742.45879: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853742.45892: getting variables 30583 1726853742.45894: in VariableManager get_vars() 30583 1726853742.45933: Calling all_inventory to load vars for managed_node2 30583 1726853742.45936: Calling groups_inventory to load vars for managed_node2 30583 1726853742.45939: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853742.45948: Calling all_plugins_play to load vars for managed_node2 30583 1726853742.45951: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853742.45957: Calling groups_plugins_play to load vars for managed_node2 30583 1726853742.47073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853742.48024: done with get_vars() 30583 1726853742.48039: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:35:42 -0400 (0:00:01.867) 0:01:17.818 ****** 30583 1726853742.48115: entering _queue_task() for managed_node2/package_facts 30583 1726853742.48362: worker is 1 (out of 1 available) 30583 1726853742.48377: exiting _queue_task() for managed_node2/package_facts 30583 1726853742.48391: done queuing things up, now waiting for results queue to drain 30583 1726853742.48392: waiting for pending results... 
30583 1726853742.48672: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 1726853742.48985: in run() - task 02083763-bbaf-05ea-abc5-00000000189d 30583 1726853742.48989: variable 'ansible_search_path' from source: unknown 30583 1726853742.48993: variable 'ansible_search_path' from source: unknown 30583 1726853742.48996: calling self._execute() 30583 1726853742.48998: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853742.49001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853742.49003: variable 'omit' from source: magic vars 30583 1726853742.49328: variable 'ansible_distribution_major_version' from source: facts 30583 1726853742.49344: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853742.49347: variable 'omit' from source: magic vars 30583 1726853742.49429: variable 'omit' from source: magic vars 30583 1726853742.49459: variable 'omit' from source: magic vars 30583 1726853742.49501: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853742.49536: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853742.49553: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853742.49570: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853742.49590: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853742.49617: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853742.49620: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853742.49622: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853742.49711: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853742.49737: Set connection var ansible_timeout to 10 30583 1726853742.49751: Set connection var ansible_connection to ssh 30583 1726853742.49754: Set connection var ansible_shell_executable to /bin/sh 30583 1726853742.49756: Set connection var ansible_shell_type to sh 30583 1726853742.49758: Set connection var ansible_pipelining to False 30583 1726853742.49822: variable 'ansible_shell_executable' from source: unknown 30583 1726853742.49825: variable 'ansible_connection' from source: unknown 30583 1726853742.49828: variable 'ansible_module_compression' from source: unknown 30583 1726853742.49830: variable 'ansible_shell_type' from source: unknown 30583 1726853742.49832: variable 'ansible_shell_executable' from source: unknown 30583 1726853742.49835: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853742.49837: variable 'ansible_pipelining' from source: unknown 30583 1726853742.49839: variable 'ansible_timeout' from source: unknown 30583 1726853742.49841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853742.50058: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853742.50062: variable 'omit' from source: magic vars 30583 1726853742.50065: starting attempt loop 30583 1726853742.50069: running the handler 30583 1726853742.50077: _low_level_execute_command(): starting 30583 1726853742.50079: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853742.50742: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853742.50763: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853742.50783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853742.50846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853742.50865: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853742.50952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853742.52686: stdout chunk (state=3): >>>/root <<< 30583 1726853742.52941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853742.52944: stdout chunk (state=3): >>><<< 30583 1726853742.52946: stderr chunk (state=3): >>><<< 30583 1726853742.52951: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853742.52954: _low_level_execute_command(): starting 30583 1726853742.52957: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853742.5285125-34218-177721670358931 `" && echo ansible-tmp-1726853742.5285125-34218-177721670358931="` echo /root/.ansible/tmp/ansible-tmp-1726853742.5285125-34218-177721670358931 `" ) && sleep 0' 30583 1726853742.53459: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853742.53526: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853742.53530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853742.53578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853742.53582: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853742.53664: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853742.55644: stdout chunk (state=3): >>>ansible-tmp-1726853742.5285125-34218-177721670358931=/root/.ansible/tmp/ansible-tmp-1726853742.5285125-34218-177721670358931 <<< 30583 1726853742.55751: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853742.55780: stderr chunk (state=3): >>><<< 30583 1726853742.55783: stdout chunk (state=3): >>><<< 30583 1726853742.55796: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853742.5285125-34218-177721670358931=/root/.ansible/tmp/ansible-tmp-1726853742.5285125-34218-177721670358931 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853742.55829: variable 'ansible_module_compression' from source: unknown 30583 1726853742.55869: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30583 1726853742.55925: variable 'ansible_facts' from source: unknown 30583 1726853742.56046: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853742.5285125-34218-177721670358931/AnsiballZ_package_facts.py 30583 1726853742.56145: Sending initial data 30583 1726853742.56148: Sent initial data (162 bytes) 30583 1726853742.56543: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853742.56547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853742.56582: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853742.56585: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853742.56587: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853742.56589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853742.56629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853742.56642: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853742.56713: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853742.58352: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30583 1726853742.58359: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853742.58418: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853742.58490: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpk3zxbr_p /root/.ansible/tmp/ansible-tmp-1726853742.5285125-34218-177721670358931/AnsiballZ_package_facts.py <<< 30583 1726853742.58494: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853742.5285125-34218-177721670358931/AnsiballZ_package_facts.py" <<< 30583 1726853742.58554: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpk3zxbr_p" to remote "/root/.ansible/tmp/ansible-tmp-1726853742.5285125-34218-177721670358931/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853742.5285125-34218-177721670358931/AnsiballZ_package_facts.py" <<< 30583 1726853742.59763: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853742.59800: stderr chunk (state=3): >>><<< 30583 1726853742.59803: stdout chunk (state=3): >>><<< 30583 1726853742.59826: done transferring module to remote 30583 1726853742.59835: _low_level_execute_command(): starting 30583 1726853742.59840: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853742.5285125-34218-177721670358931/ /root/.ansible/tmp/ansible-tmp-1726853742.5285125-34218-177721670358931/AnsiballZ_package_facts.py && sleep 0' 30583 1726853742.60265: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853742.60268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853742.60272: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853742.60275: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853742.60277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853742.60324: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853742.60327: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853742.60403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853742.62301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853742.62321: stderr chunk (state=3): >>><<< 30583 1726853742.62324: stdout chunk (state=3): >>><<< 30583 1726853742.62335: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853742.62338: _low_level_execute_command(): starting 30583 1726853742.62342: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853742.5285125-34218-177721670358931/AnsiballZ_package_facts.py && sleep 0' 30583 1726853742.62776: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853742.62779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853742.62781: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853742.62783: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853742.62791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853742.62835: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853742.62839: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853742.62920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853743.07953: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", 
"version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 30583 1726853743.08011: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", 
"version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": 
"0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 30583 1726853743.08056: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", 
"release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": 
"diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 30583 1726853743.08072: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": 
"xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 30583 1726853743.08083: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", 
"version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": 
"libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": 
[{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", 
"version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": 
"langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": 
"102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": 
"vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", 
"version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 30583 1726853743.08160: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": 
"python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": 
"0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 30583 1726853743.08164: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30583 1726853743.10043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853743.10052: stderr chunk (state=3): >>><<< 30583 1726853743.10055: stdout chunk (state=3): >>><<< 30583 1726853743.10094: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853743.11303: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853742.5285125-34218-177721670358931/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853743.11321: _low_level_execute_command(): starting 30583 1726853743.11324: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853742.5285125-34218-177721670358931/ > /dev/null 2>&1 && sleep 0' 30583 1726853743.11765: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853743.11768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853743.11770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853743.11775: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853743.11777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853743.11829: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853743.11833: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853743.11836: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853743.11918: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853743.14078: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853743.14082: stdout chunk (state=3): >>><<< 30583 1726853743.14085: stderr chunk (state=3): >>><<< 30583 1726853743.14087: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853743.14089: handler run complete 30583 1726853743.14866: variable 'ansible_facts' from source: unknown 30583 1726853743.15377: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853743.17389: variable 'ansible_facts' from source: unknown 30583 1726853743.17887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853743.18648: attempt loop complete, returning result 30583 1726853743.18675: _execute() done 30583 1726853743.18683: dumping result to json 30583 1726853743.18911: done dumping result, returning 30583 1726853743.18926: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-05ea-abc5-00000000189d] 30583 1726853743.18935: sending task result for task 02083763-bbaf-05ea-abc5-00000000189d 30583 1726853743.21510: done sending task result for task 02083763-bbaf-05ea-abc5-00000000189d 30583 1726853743.21514: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853743.21712: no more pending results, returning what we have 30583 1726853743.21715: results queue empty 30583 1726853743.21716: checking for any_errors_fatal 30583 1726853743.21722: done checking for any_errors_fatal 30583 1726853743.21723: checking for max_fail_percentage 30583 1726853743.21724: done checking for max_fail_percentage 30583 1726853743.21725: checking to see if all hosts have failed and the running result is not ok 30583 1726853743.21726: done checking to see if all hosts have failed 30583 1726853743.21726: getting the remaining hosts for this loop 30583 1726853743.21728: done getting the remaining hosts for this loop 30583 1726853743.21731: getting 
the next task for host managed_node2 30583 1726853743.21739: done getting next task for host managed_node2 30583 1726853743.21743: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853743.21748: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853743.21877: getting variables 30583 1726853743.21879: in VariableManager get_vars() 30583 1726853743.21910: Calling all_inventory to load vars for managed_node2 30583 1726853743.21913: Calling groups_inventory to load vars for managed_node2 30583 1726853743.21915: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853743.21923: Calling all_plugins_play to load vars for managed_node2 30583 1726853743.21926: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853743.21928: Calling groups_plugins_play to load vars for managed_node2 30583 1726853743.23341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853743.25005: done with get_vars() 30583 1726853743.25033: done getting variables 30583 1726853743.25098: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:35:43 -0400 (0:00:00.770) 0:01:18.588 ****** 30583 1726853743.25147: entering _queue_task() for managed_node2/debug 30583 1726853743.25673: worker is 1 (out of 1 available) 30583 1726853743.25687: exiting _queue_task() for managed_node2/debug 30583 1726853743.25699: done queuing things up, now waiting for results queue to drain 30583 1726853743.25701: waiting for pending results... 
30583 1726853743.25992: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853743.26142: in run() - task 02083763-bbaf-05ea-abc5-00000000183b 30583 1726853743.26172: variable 'ansible_search_path' from source: unknown 30583 1726853743.26181: variable 'ansible_search_path' from source: unknown 30583 1726853743.26227: calling self._execute() 30583 1726853743.26367: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853743.26372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853743.26375: variable 'omit' from source: magic vars 30583 1726853743.26808: variable 'ansible_distribution_major_version' from source: facts 30583 1726853743.26826: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853743.26861: variable 'omit' from source: magic vars 30583 1726853743.26912: variable 'omit' from source: magic vars 30583 1726853743.27002: variable 'network_provider' from source: set_fact 30583 1726853743.27079: variable 'omit' from source: magic vars 30583 1726853743.27082: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853743.27116: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853743.27140: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853743.27162: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853743.27184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853743.27221: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853743.27295: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 
1726853743.27298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853743.27340: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853743.27351: Set connection var ansible_timeout to 10 30583 1726853743.27357: Set connection var ansible_connection to ssh 30583 1726853743.27368: Set connection var ansible_shell_executable to /bin/sh 30583 1726853743.27376: Set connection var ansible_shell_type to sh 30583 1726853743.27389: Set connection var ansible_pipelining to False 30583 1726853743.27424: variable 'ansible_shell_executable' from source: unknown 30583 1726853743.27520: variable 'ansible_connection' from source: unknown 30583 1726853743.27523: variable 'ansible_module_compression' from source: unknown 30583 1726853743.27525: variable 'ansible_shell_type' from source: unknown 30583 1726853743.27527: variable 'ansible_shell_executable' from source: unknown 30583 1726853743.27529: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853743.27531: variable 'ansible_pipelining' from source: unknown 30583 1726853743.27533: variable 'ansible_timeout' from source: unknown 30583 1726853743.27535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853743.27617: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853743.27638: variable 'omit' from source: magic vars 30583 1726853743.27652: starting attempt loop 30583 1726853743.27662: running the handler 30583 1726853743.27713: handler run complete 30583 1726853743.27737: attempt loop complete, returning result 30583 1726853743.27760: _execute() done 30583 1726853743.27764: dumping result to json 30583 1726853743.27766: done dumping result, returning 
30583 1726853743.27848: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-05ea-abc5-00000000183b] 30583 1726853743.27852: sending task result for task 02083763-bbaf-05ea-abc5-00000000183b 30583 1726853743.27932: done sending task result for task 02083763-bbaf-05ea-abc5-00000000183b 30583 1726853743.27936: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 30583 1726853743.28028: no more pending results, returning what we have 30583 1726853743.28032: results queue empty 30583 1726853743.28033: checking for any_errors_fatal 30583 1726853743.28273: done checking for any_errors_fatal 30583 1726853743.28274: checking for max_fail_percentage 30583 1726853743.28276: done checking for max_fail_percentage 30583 1726853743.28277: checking to see if all hosts have failed and the running result is not ok 30583 1726853743.28278: done checking to see if all hosts have failed 30583 1726853743.28279: getting the remaining hosts for this loop 30583 1726853743.28280: done getting the remaining hosts for this loop 30583 1726853743.28284: getting the next task for host managed_node2 30583 1726853743.28291: done getting next task for host managed_node2 30583 1726853743.28294: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853743.28300: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853743.28313: getting variables 30583 1726853743.28314: in VariableManager get_vars() 30583 1726853743.28350: Calling all_inventory to load vars for managed_node2 30583 1726853743.28352: Calling groups_inventory to load vars for managed_node2 30583 1726853743.28355: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853743.28365: Calling all_plugins_play to load vars for managed_node2 30583 1726853743.28368: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853743.28376: Calling groups_plugins_play to load vars for managed_node2 30583 1726853743.29899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853743.31542: done with get_vars() 30583 1726853743.31576: done getting variables 30583 1726853743.31646: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:35:43 -0400 (0:00:00.065) 0:01:18.654 ****** 30583 1726853743.31694: entering _queue_task() for managed_node2/fail 30583 1726853743.32179: worker is 1 (out of 1 available) 30583 1726853743.32191: exiting _queue_task() for managed_node2/fail 30583 1726853743.32203: done queuing things up, now waiting for results queue to drain 30583 1726853743.32205: waiting for pending results... 30583 1726853743.32734: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853743.32739: in run() - task 02083763-bbaf-05ea-abc5-00000000183c 30583 1726853743.32755: variable 'ansible_search_path' from source: unknown 30583 1726853743.32765: variable 'ansible_search_path' from source: unknown 30583 1726853743.32827: calling self._execute() 30583 1726853743.33167: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853743.33172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853743.33176: variable 'omit' from source: magic vars 30583 1726853743.34177: variable 'ansible_distribution_major_version' from source: facts 30583 1726853743.34181: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853743.34334: variable 'network_state' from source: role '' defaults 30583 1726853743.34350: Evaluated conditional (network_state != {}): False 30583 1726853743.34360: when evaluation is False, skipping this task 30583 1726853743.34368: _execute() done 30583 1726853743.34377: dumping result to json 30583 1726853743.34385: done dumping result, returning 30583 1726853743.34452: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-05ea-abc5-00000000183c] 30583 1726853743.34466: sending task result for task 02083763-bbaf-05ea-abc5-00000000183c skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853743.34815: no more pending results, returning what we have 30583 1726853743.34819: results queue empty 30583 1726853743.34820: checking for any_errors_fatal 30583 1726853743.34827: done checking for any_errors_fatal 30583 1726853743.34828: checking for max_fail_percentage 30583 1726853743.34830: done checking for max_fail_percentage 30583 1726853743.34831: checking to see if all hosts have failed and the running result is not ok 30583 1726853743.34832: done checking to see if all hosts have failed 30583 1726853743.34832: getting the remaining hosts for this loop 30583 1726853743.34834: done getting the remaining hosts for this loop 30583 1726853743.34839: getting the next task for host managed_node2 30583 1726853743.34849: done getting next task for host managed_node2 30583 1726853743.34854: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30583 1726853743.34862: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853743.34889: getting variables 30583 1726853743.34891: in VariableManager get_vars() 30583 1726853743.34934: Calling all_inventory to load vars for managed_node2 30583 1726853743.34937: Calling groups_inventory to load vars for managed_node2 30583 1726853743.34940: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853743.34952: Calling all_plugins_play to load vars for managed_node2 30583 1726853743.34955: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853743.34961: Calling groups_plugins_play to load vars for managed_node2 30583 1726853743.35718: done sending task result for task 02083763-bbaf-05ea-abc5-00000000183c 30583 1726853743.35722: WORKER PROCESS EXITING 30583 1726853743.37883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853743.40683: done with get_vars() 30583 1726853743.40709: done getting variables 30583 1726853743.40779: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Friday 20 September 2024 13:35:43 -0400 (0:00:00.091) 0:01:18.745 ******
30583 1726853743.40815: entering _queue_task() for managed_node2/fail
30583 1726853743.41205: worker is 1 (out of 1 available)
30583 1726853743.41217: exiting _queue_task() for managed_node2/fail
30583 1726853743.41229: done queuing things up, now waiting for results queue to drain
30583 1726853743.41230: waiting for pending results...
30583 1726853743.41552: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
30583 1726853743.41728: in run() - task 02083763-bbaf-05ea-abc5-00000000183d
30583 1726853743.41748: variable 'ansible_search_path' from source: unknown
30583 1726853743.41755: variable 'ansible_search_path' from source: unknown
30583 1726853743.41801: calling self._execute()
30583 1726853743.41913: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853743.41927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853743.41949: variable 'omit' from source: magic vars
30583 1726853743.42355: variable 'ansible_distribution_major_version' from source: facts
30583 1726853743.42383: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853743.42517: variable 'network_state' from source: role '' defaults
30583 1726853743.42533: Evaluated conditional (network_state != {}): False
30583 1726853743.42541: when evaluation is False, skipping this task
30583 1726853743.42549: _execute() done
30583 1726853743.42556: dumping result to json
30583 1726853743.42567: done dumping result, returning
30583 1726853743.42582: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-05ea-abc5-00000000183d]
30583 1726853743.42599: sending task result for task 02083763-bbaf-05ea-abc5-00000000183d
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30583 1726853743.42837: no more pending results, returning what we have
30583 1726853743.42842: results queue empty
30583 1726853743.42843: checking for any_errors_fatal
30583 1726853743.42852: done checking for any_errors_fatal
30583 1726853743.42853: checking for max_fail_percentage
30583 1726853743.42856: done checking for max_fail_percentage
30583 1726853743.42857: checking to see if all hosts have failed and the running result is not ok
30583 1726853743.42860: done checking to see if all hosts have failed
30583 1726853743.42861: getting the remaining hosts for this loop
30583 1726853743.42863: done getting the remaining hosts for this loop
30583 1726853743.42867: getting the next task for host managed_node2
30583 1726853743.42879: done getting next task for host managed_node2
30583 1726853743.42883: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
30583 1726853743.42889: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853743.42916: getting variables
30583 1726853743.42918: in VariableManager get_vars()
30583 1726853743.42966: Calling all_inventory to load vars for managed_node2
30583 1726853743.42969: Calling groups_inventory to load vars for managed_node2
30583 1726853743.43084: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853743.43098: Calling all_plugins_play to load vars for managed_node2
30583 1726853743.43101: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853743.43105: Calling groups_plugins_play to load vars for managed_node2
30583 1726853743.43690: done sending task result for task 02083763-bbaf-05ea-abc5-00000000183d
30583 1726853743.43693: WORKER PROCESS EXITING
30583 1726853743.44730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853743.46414: done with get_vars()
30583 1726853743.46451: done getting variables
30583 1726853743.46516: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024 13:35:43 -0400 (0:00:00.057) 0:01:18.802 ******
30583 1726853743.46562: entering _queue_task() for managed_node2/fail
30583 1726853743.46953: worker is 1 (out of 1 available)
30583 1726853743.46970: exiting _queue_task() for managed_node2/fail
30583 1726853743.46984: done queuing things up, now waiting for results queue to drain
30583 1726853743.46986: waiting for pending results...
30583 1726853743.47308: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
30583 1726853743.47485: in run() - task 02083763-bbaf-05ea-abc5-00000000183e
30583 1726853743.47506: variable 'ansible_search_path' from source: unknown
30583 1726853743.47513: variable 'ansible_search_path' from source: unknown
30583 1726853743.47564: calling self._execute()
30583 1726853743.47678: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853743.47689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853743.47703: variable 'omit' from source: magic vars
30583 1726853743.48112: variable 'ansible_distribution_major_version' from source: facts
30583 1726853743.48129: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853743.48333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30583 1726853743.50847: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30583 1726853743.50978: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30583 1726853743.50984: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30583 1726853743.51032: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30583 1726853743.51067: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30583 1726853743.51160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853743.51608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853743.51628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853743.51657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853743.51672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853743.51753: variable 'ansible_distribution_major_version' from source: facts
30583 1726853743.51768: Evaluated conditional (ansible_distribution_major_version | int > 9): True
30583 1726853743.51852: variable 'ansible_distribution' from source: facts
30583 1726853743.51856: variable '__network_rh_distros' from source: role '' defaults
30583 1726853743.51867: Evaluated conditional (ansible_distribution in __network_rh_distros): True
30583 1726853743.52027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853743.52044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853743.52060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853743.52092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853743.52103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853743.52137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853743.52153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853743.52173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853743.52201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853743.52211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853743.52241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853743.52256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853743.52277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853743.52305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853743.52316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853743.52512: variable 'network_connections' from source: include params
30583 1726853743.52515: variable 'interface' from source: play vars
30583 1726853743.52566: variable 'interface' from source: play vars
30583 1726853743.52578: variable 'network_state' from source: role '' defaults
30583 1726853743.52624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30583 1726853743.52741: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30583 1726853743.52773: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30583 1726853743.52795: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30583 1726853743.52816: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30583 1726853743.52849: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30583 1726853743.52867: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30583 1726853743.52892: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853743.52909: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30583 1726853743.52936: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False
30583 1726853743.52941: when evaluation is False, skipping this task
30583 1726853743.52944: _execute() done
30583 1726853743.52947: dumping result to json
30583 1726853743.52949: done dumping result, returning
30583 1726853743.52956: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-05ea-abc5-00000000183e]
30583 1726853743.52959: sending task result for task 02083763-bbaf-05ea-abc5-00000000183e
30583 1726853743.53049: done sending task result for task 02083763-bbaf-05ea-abc5-00000000183e
30583 1726853743.53052: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0",
    "skip_reason": "Conditional result was False"
}
30583 1726853743.53108: no more pending results, returning what we have
30583 1726853743.53111: results queue empty
30583 1726853743.53112: checking for any_errors_fatal
30583 1726853743.53118: done checking for any_errors_fatal
30583 1726853743.53119: checking for max_fail_percentage
30583 1726853743.53125: done checking for max_fail_percentage
30583 1726853743.53126: checking to see if all hosts have failed and the running result is not ok
30583 1726853743.53127: done checking to see if all hosts have failed
30583 1726853743.53127: getting the remaining hosts for this loop
30583 1726853743.53130: done getting the remaining hosts for this loop
30583 1726853743.53134: getting the next task for host managed_node2
30583 1726853743.53154: done getting next task for host managed_node2
30583 1726853743.53159: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
30583 1726853743.53165: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853743.53189: getting variables
30583 1726853743.53191: in VariableManager get_vars()
30583 1726853743.53230: Calling all_inventory to load vars for managed_node2
30583 1726853743.53233: Calling groups_inventory to load vars for managed_node2
30583 1726853743.53235: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853743.53244: Calling all_plugins_play to load vars for managed_node2
30583 1726853743.53247: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853743.53249: Calling groups_plugins_play to load vars for managed_node2
30583 1726853743.55367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853743.56237: done with get_vars()
30583 1726853743.56260: done getting variables
30583 1726853743.56306: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 13:35:43 -0400 (0:00:00.097) 0:01:18.900 ******
30583 1726853743.56332: entering _queue_task() for managed_node2/dnf
30583 1726853743.56645: worker is 1 (out of 1 available)
30583 1726853743.56663: exiting _queue_task() for managed_node2/dnf
30583 1726853743.56678: done queuing things up, now waiting for results queue to drain
30583 1726853743.56679: waiting for pending results...
30583 1726853743.57594: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
30583 1726853743.57604: in run() - task 02083763-bbaf-05ea-abc5-00000000183f
30583 1726853743.57609: variable 'ansible_search_path' from source: unknown
30583 1726853743.57612: variable 'ansible_search_path' from source: unknown
30583 1726853743.57614: calling self._execute()
30583 1726853743.57617: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853743.57620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853743.57622: variable 'omit' from source: magic vars
30583 1726853743.57802: variable 'ansible_distribution_major_version' from source: facts
30583 1726853743.57812: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853743.58126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30583 1726853743.60525: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30583 1726853743.60612: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30583 1726853743.60657: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30583 1726853743.60706: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30583 1726853743.60736: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30583 1726853743.60826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853743.60880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853743.60914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853743.60960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853743.60982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853743.61119: variable 'ansible_distribution' from source: facts
30583 1726853743.61129: variable 'ansible_distribution_major_version' from source: facts
30583 1726853743.61149: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
30583 1726853743.61280: variable '__network_wireless_connections_defined' from source: role '' defaults
30583 1726853743.61463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853743.61535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853743.61676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853743.61715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853743.61790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853743.61980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853743.61983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853743.61996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853743.62039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853743.62106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853743.62150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853743.62217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853743.62378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853743.62381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853743.62383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853743.62729: variable 'network_connections' from source: include params
30583 1726853743.62778: variable 'interface' from source: play vars
30583 1726853743.62832: variable 'interface' from source: play vars
30583 1726853743.62960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30583 1726853743.63193: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30583 1726853743.63239: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30583 1726853743.63277: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30583 1726853743.63317: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30583 1726853743.63368: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30583 1726853743.63397: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30583 1726853743.63490: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853743.63493: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30583 1726853743.63557: variable '__network_team_connections_defined' from source: role '' defaults
30583 1726853743.63837: variable 'network_connections' from source: include params
30583 1726853743.63849: variable 'interface' from source: play vars
30583 1726853743.63919: variable 'interface' from source: play vars
30583 1726853743.63955: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
30583 1726853743.63964: when evaluation is False, skipping this task
30583 1726853743.63977: _execute() done
30583 1726853743.63984: dumping result to json
30583 1726853743.63992: done dumping result, returning
30583 1726853743.64005: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-00000000183f]
30583 1726853743.64015: sending task result for task 02083763-bbaf-05ea-abc5-00000000183f
30583 1726853743.64277: done sending task result for task 02083763-bbaf-05ea-abc5-00000000183f
30583 1726853743.64281: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
30583 1726853743.64334: no more pending results, returning what we have
30583 1726853743.64338: results queue empty
30583 1726853743.64339: checking for any_errors_fatal
30583 1726853743.64346: done checking for any_errors_fatal
30583 1726853743.64347: checking for max_fail_percentage
30583 1726853743.64349: done checking for max_fail_percentage
30583 1726853743.64350: checking to see if all hosts have failed and the running result is not ok
30583 1726853743.64350: done checking to see if all hosts have failed
30583 1726853743.64351: getting the remaining hosts for this loop
30583 1726853743.64353: done getting the remaining hosts for this loop
30583 1726853743.64357: getting the next task for host managed_node2
30583 1726853743.64366: done getting next task for host managed_node2
30583 1726853743.64372: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
30583 1726853743.64377: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853743.64402: getting variables
30583 1726853743.64404: in VariableManager get_vars()
30583 1726853743.64449: Calling all_inventory to load vars for managed_node2
30583 1726853743.64452: Calling groups_inventory to load vars for managed_node2
30583 1726853743.64455: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853743.64466: Calling all_plugins_play to load vars for managed_node2
30583 1726853743.64469: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853743.64581: Calling groups_plugins_play to load vars for managed_node2
30583 1726853743.66099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853743.68113: done with get_vars()
30583 1726853743.68146: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
30583 1726853743.68429: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024 13:35:43 -0400 (0:00:00.121) 0:01:19.021 ******
30583 1726853743.68464: entering _queue_task() for managed_node2/yum
30583 1726853743.68916: worker is 1 (out of 1 available)
30583 1726853743.68932: exiting _queue_task() for managed_node2/yum
30583 1726853743.68944: done queuing things up, now waiting for results queue to drain
30583 1726853743.68945: waiting for pending results...
30583 1726853743.69207: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
30583 1726853743.69368: in run() - task 02083763-bbaf-05ea-abc5-000000001840
30583 1726853743.69395: variable 'ansible_search_path' from source: unknown
30583 1726853743.69404: variable 'ansible_search_path' from source: unknown
30583 1726853743.69446: calling self._execute()
30583 1726853743.69577: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853743.69580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853743.69583: variable 'omit' from source: magic vars
30583 1726853743.69956: variable 'ansible_distribution_major_version' from source: facts
30583 1726853743.69975: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853743.70156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30583 1726853743.72863: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30583 1726853743.72926: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30583 1726853743.73176: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30583 1726853743.73179: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30583 1726853743.73182: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30583 1726853743.73186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853743.73188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853743.73190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853743.73221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853743.73241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853743.73344: variable 'ansible_distribution_major_version' from source: facts
30583 1726853743.73366: Evaluated conditional (ansible_distribution_major_version | int < 8): False
30583 1726853743.73377: when evaluation is False, skipping this task
30583 1726853743.73385: _execute() done
30583 1726853743.73393: dumping result to json
30583 1726853743.73401: done dumping result, returning
30583 1726853743.73418: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000001840]
30583 1726853743.73428: sending task result for task 02083763-bbaf-05ea-abc5-000000001840
30583 1726853743.73677: done sending task result for task 02083763-bbaf-05ea-abc5-000000001840
30583 1726853743.73681: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
30583 1726853743.73736: no more pending results, returning what we have
30583 1726853743.73740: results queue empty
30583 1726853743.73742: checking for any_errors_fatal
30583 1726853743.73749: done checking for any_errors_fatal
30583 1726853743.73750: checking for max_fail_percentage
30583 1726853743.73752: done checking for max_fail_percentage
30583 1726853743.73754: checking to see if all hosts have failed and the running result is not ok
30583 1726853743.73754: done checking to see if all hosts have failed
30583 1726853743.73755: getting the remaining hosts for this loop
30583 1726853743.73757: done getting the remaining hosts for this loop
30583 1726853743.73762: getting the next task for host managed_node2
30583 1726853743.73774: done getting next task for host managed_node2
30583 1726853743.73779: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
30583 1726853743.73784: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853743.73808: getting variables
30583 1726853743.73810: in VariableManager get_vars()
30583 1726853743.73855: Calling all_inventory to load vars for managed_node2
30583 1726853743.73858: Calling groups_inventory to load vars for managed_node2
30583 1726853743.73861: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853743.73873: Calling all_plugins_play to load vars for managed_node2
30583 1726853743.73878: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853743.73882: Calling groups_plugins_play to load vars for managed_node2
30583 1726853743.75577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853743.77152: done with get_vars()
30583 1726853743.77183: done getting variables
30583 1726853743.77248: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager
due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:35:43 -0400 (0:00:00.088) 0:01:19.110 ****** 30583 1726853743.77291: entering _queue_task() for managed_node2/fail 30583 1726853743.77656: worker is 1 (out of 1 available) 30583 1726853743.77669: exiting _queue_task() for managed_node2/fail 30583 1726853743.77884: done queuing things up, now waiting for results queue to drain 30583 1726853743.77886: waiting for pending results... 30583 1726853743.77993: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30583 1726853743.78131: in run() - task 02083763-bbaf-05ea-abc5-000000001841 30583 1726853743.78149: variable 'ansible_search_path' from source: unknown 30583 1726853743.78177: variable 'ansible_search_path' from source: unknown 30583 1726853743.78199: calling self._execute() 30583 1726853743.78304: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853743.78330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853743.78338: variable 'omit' from source: magic vars 30583 1726853743.78766: variable 'ansible_distribution_major_version' from source: facts 30583 1726853743.78876: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853743.78904: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853743.79115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853743.81393: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853743.81465: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853743.81511: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853743.81550: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853743.81586: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853743.81666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853743.81722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853743.81752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853743.81799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853743.81822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853743.81915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853743.81918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853743.81935: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853743.81980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853743.82000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853743.82048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853743.82078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853743.82107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853743.82177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853743.82180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853743.82359: variable 'network_connections' from source: include params 30583 1726853743.82379: variable 'interface' from source: play vars 30583 1726853743.82451: variable 'interface' from source: play vars 30583 1726853743.82532: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853743.82709: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853743.82877: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853743.82880: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853743.82882: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853743.82885: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853743.82887: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853743.82915: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853743.82945: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853743.83017: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853743.83276: variable 'network_connections' from source: include params 30583 1726853743.83287: variable 'interface' from source: play vars 30583 1726853743.83358: variable 'interface' from source: play vars 30583 1726853743.83397: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853743.83406: when evaluation is False, skipping this task 30583 
1726853743.83413: _execute() done 30583 1726853743.83420: dumping result to json 30583 1726853743.83427: done dumping result, returning 30583 1726853743.83444: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000001841] 30583 1726853743.83454: sending task result for task 02083763-bbaf-05ea-abc5-000000001841 30583 1726853743.83615: done sending task result for task 02083763-bbaf-05ea-abc5-000000001841 30583 1726853743.83619: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853743.83706: no more pending results, returning what we have 30583 1726853743.83711: results queue empty 30583 1726853743.83712: checking for any_errors_fatal 30583 1726853743.83720: done checking for any_errors_fatal 30583 1726853743.83721: checking for max_fail_percentage 30583 1726853743.83723: done checking for max_fail_percentage 30583 1726853743.83724: checking to see if all hosts have failed and the running result is not ok 30583 1726853743.83725: done checking to see if all hosts have failed 30583 1726853743.83726: getting the remaining hosts for this loop 30583 1726853743.83728: done getting the remaining hosts for this loop 30583 1726853743.83733: getting the next task for host managed_node2 30583 1726853743.83743: done getting next task for host managed_node2 30583 1726853743.83747: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30583 1726853743.83753: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853743.83778: getting variables 30583 1726853743.83781: in VariableManager get_vars() 30583 1726853743.83826: Calling all_inventory to load vars for managed_node2 30583 1726853743.83829: Calling groups_inventory to load vars for managed_node2 30583 1726853743.83832: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853743.83843: Calling all_plugins_play to load vars for managed_node2 30583 1726853743.83847: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853743.83850: Calling groups_plugins_play to load vars for managed_node2 30583 1726853743.85467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853743.87444: done with get_vars() 30583 1726853743.87466: done getting variables 30583 1726853743.87525: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:35:43 -0400 (0:00:00.102) 0:01:19.212 ****** 30583 1726853743.87563: entering _queue_task() for managed_node2/package 30583 1726853743.88074: worker is 1 (out of 1 available) 30583 1726853743.88133: exiting _queue_task() for managed_node2/package 30583 1726853743.88145: done queuing things up, now waiting for results queue to drain 30583 1726853743.88146: waiting for pending results... 30583 1726853743.88432: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 30583 1726853743.88639: in run() - task 02083763-bbaf-05ea-abc5-000000001842 30583 1726853743.88643: variable 'ansible_search_path' from source: unknown 30583 1726853743.88647: variable 'ansible_search_path' from source: unknown 30583 1726853743.88650: calling self._execute() 30583 1726853743.88722: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853743.88734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853743.88757: variable 'omit' from source: magic vars 30583 1726853743.89143: variable 'ansible_distribution_major_version' from source: facts 30583 1726853743.89161: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853743.89372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853743.89664: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853743.89724: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853743.89833: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853743.89843: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853743.90034: variable 'network_packages' from source: role '' defaults 30583 1726853743.90376: variable '__network_provider_setup' from source: role '' defaults 30583 1726853743.90876: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853743.90879: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853743.90882: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853743.90884: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853743.91494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853743.94856: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853743.94923: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853743.94957: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853743.94997: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853743.95022: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853743.95109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853743.95137: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853743.95163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853743.95210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853743.95224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853743.95268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853743.95293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853743.95321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853743.95360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853743.95379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 
1726853743.95617: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853743.95732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853743.95760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853743.95787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853743.95845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853743.95869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853743.95978: variable 'ansible_python' from source: facts 30583 1726853743.95994: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853743.96063: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853743.96518: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853743.96521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853743.96524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853743.96776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853743.96779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853743.96781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853743.96784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853743.96795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853743.96797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853743.96800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853743.96802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853743.96920: variable 'network_connections' from source: include params 
30583 1726853743.96923: variable 'interface' from source: play vars 30583 1726853743.97029: variable 'interface' from source: play vars 30583 1726853743.97124: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853743.97151: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853743.97188: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853743.97222: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853743.97273: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853743.97611: variable 'network_connections' from source: include params 30583 1726853743.97614: variable 'interface' from source: play vars 30583 1726853743.97737: variable 'interface' from source: play vars 30583 1726853743.97837: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853743.97966: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853743.98316: variable 'network_connections' from source: include params 30583 1726853743.98320: variable 'interface' from source: play vars 30583 1726853743.98381: variable 'interface' from source: play vars 30583 1726853743.98407: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853743.98490: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853743.98810: variable 'network_connections' 
from source: include params 30583 1726853743.98813: variable 'interface' from source: play vars 30583 1726853743.98883: variable 'interface' from source: play vars 30583 1726853743.98950: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853743.99023: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853743.99029: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853743.99098: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853743.99325: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853743.99825: variable 'network_connections' from source: include params 30583 1726853743.99835: variable 'interface' from source: play vars 30583 1726853743.99900: variable 'interface' from source: play vars 30583 1726853743.99903: variable 'ansible_distribution' from source: facts 30583 1726853743.99906: variable '__network_rh_distros' from source: role '' defaults 30583 1726853743.99927: variable 'ansible_distribution_major_version' from source: facts 30583 1726853743.99981: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853744.00415: variable 'ansible_distribution' from source: facts 30583 1726853744.00418: variable '__network_rh_distros' from source: role '' defaults 30583 1726853744.00421: variable 'ansible_distribution_major_version' from source: facts 30583 1726853744.00431: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853744.00722: variable 'ansible_distribution' from source: facts 30583 1726853744.00726: variable '__network_rh_distros' from source: role '' defaults 30583 1726853744.00728: variable 'ansible_distribution_major_version' from source: facts 30583 1726853744.00748: variable 'network_provider' from source: set_fact 30583 
1726853744.00767: variable 'ansible_facts' from source: unknown 30583 1726853744.01921: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30583 1726853744.01925: when evaluation is False, skipping this task 30583 1726853744.01927: _execute() done 30583 1726853744.01930: dumping result to json 30583 1726853744.01931: done dumping result, returning 30583 1726853744.01943: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-05ea-abc5-000000001842] 30583 1726853744.01945: sending task result for task 02083763-bbaf-05ea-abc5-000000001842 skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30583 1726853744.02202: no more pending results, returning what we have 30583 1726853744.02206: results queue empty 30583 1726853744.02207: checking for any_errors_fatal 30583 1726853744.02213: done checking for any_errors_fatal 30583 1726853744.02214: checking for max_fail_percentage 30583 1726853744.02216: done checking for max_fail_percentage 30583 1726853744.02217: checking to see if all hosts have failed and the running result is not ok 30583 1726853744.02218: done checking to see if all hosts have failed 30583 1726853744.02218: getting the remaining hosts for this loop 30583 1726853744.02220: done getting the remaining hosts for this loop 30583 1726853744.02224: getting the next task for host managed_node2 30583 1726853744.02232: done getting next task for host managed_node2 30583 1726853744.02237: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30583 1726853744.02242: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853744.02264: getting variables 30583 1726853744.02265: in VariableManager get_vars() 30583 1726853744.02307: Calling all_inventory to load vars for managed_node2 30583 1726853744.02310: Calling groups_inventory to load vars for managed_node2 30583 1726853744.02317: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853744.02326: Calling all_plugins_play to load vars for managed_node2 30583 1726853744.02328: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853744.02330: Calling groups_plugins_play to load vars for managed_node2 30583 1726853744.02888: done sending task result for task 02083763-bbaf-05ea-abc5-000000001842 30583 1726853744.02892: WORKER PROCESS EXITING 30583 1726853744.03216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853744.04951: done with get_vars() 30583 1726853744.04993: done getting variables 30583 1726853744.05050: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:35:44 -0400 (0:00:00.175) 0:01:19.388 ****** 30583 1726853744.05091: entering _queue_task() for managed_node2/package 30583 1726853744.05411: worker is 1 (out of 1 available) 30583 1726853744.05426: exiting _queue_task() for managed_node2/package 30583 1726853744.05440: done queuing things up, now waiting for results queue to drain 30583 1726853744.05442: waiting for pending results... 30583 1726853744.05788: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30583 1726853744.05863: in run() - task 02083763-bbaf-05ea-abc5-000000001843 30583 1726853744.05887: variable 'ansible_search_path' from source: unknown 30583 1726853744.05896: variable 'ansible_search_path' from source: unknown 30583 1726853744.05944: calling self._execute() 30583 1726853744.06047: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853744.06063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853744.06080: variable 'omit' from source: magic vars 30583 1726853744.06582: variable 'ansible_distribution_major_version' from source: facts 30583 1726853744.06586: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853744.06695: variable 'network_state' from source: role '' defaults 30583 1726853744.06711: Evaluated conditional (network_state != {}): False 30583 1726853744.06719: when evaluation 
is False, skipping this task 30583 1726853744.06727: _execute() done 30583 1726853744.06735: dumping result to json 30583 1726853744.06742: done dumping result, returning 30583 1726853744.06762: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-05ea-abc5-000000001843] 30583 1726853744.06775: sending task result for task 02083763-bbaf-05ea-abc5-000000001843 30583 1726853744.07002: done sending task result for task 02083763-bbaf-05ea-abc5-000000001843 30583 1726853744.07006: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853744.07056: no more pending results, returning what we have 30583 1726853744.07063: results queue empty 30583 1726853744.07064: checking for any_errors_fatal 30583 1726853744.07075: done checking for any_errors_fatal 30583 1726853744.07076: checking for max_fail_percentage 30583 1726853744.07078: done checking for max_fail_percentage 30583 1726853744.07079: checking to see if all hosts have failed and the running result is not ok 30583 1726853744.07080: done checking to see if all hosts have failed 30583 1726853744.07081: getting the remaining hosts for this loop 30583 1726853744.07082: done getting the remaining hosts for this loop 30583 1726853744.07086: getting the next task for host managed_node2 30583 1726853744.07095: done getting next task for host managed_node2 30583 1726853744.07101: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30583 1726853744.07105: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853744.07125: getting variables 30583 1726853744.07126: in VariableManager get_vars() 30583 1726853744.07164: Calling all_inventory to load vars for managed_node2 30583 1726853744.07167: Calling groups_inventory to load vars for managed_node2 30583 1726853744.07169: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853744.07249: Calling all_plugins_play to load vars for managed_node2 30583 1726853744.07253: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853744.07256: Calling groups_plugins_play to load vars for managed_node2 30583 1726853744.08632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853744.10069: done with get_vars() 30583 1726853744.10097: done getting variables 30583 1726853744.10162: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:35:44 -0400 (0:00:00.051) 0:01:19.439 ****** 30583 1726853744.10199: entering _queue_task() for managed_node2/package 30583 1726853744.10527: worker is 1 (out of 1 available) 30583 1726853744.10541: exiting _queue_task() for managed_node2/package 30583 1726853744.10552: done queuing things up, now waiting for results queue to drain 30583 1726853744.10554: waiting for pending results... 30583 1726853744.10982: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30583 1726853744.11044: in run() - task 02083763-bbaf-05ea-abc5-000000001844 30583 1726853744.11069: variable 'ansible_search_path' from source: unknown 30583 1726853744.11085: variable 'ansible_search_path' from source: unknown 30583 1726853744.11129: calling self._execute() 30583 1726853744.11236: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853744.11248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853744.11266: variable 'omit' from source: magic vars 30583 1726853744.11757: variable 'ansible_distribution_major_version' from source: facts 30583 1726853744.11764: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853744.11854: variable 'network_state' from source: role '' defaults 30583 1726853744.11868: Evaluated conditional (network_state != {}): False 30583 1726853744.11873: when evaluation is False, skipping this task 30583 1726853744.11876: _execute() done 30583 1726853744.11879: dumping 
result to json 30583 1726853744.11881: done dumping result, returning 30583 1726853744.11889: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-05ea-abc5-000000001844] 30583 1726853744.11894: sending task result for task 02083763-bbaf-05ea-abc5-000000001844 30583 1726853744.11989: done sending task result for task 02083763-bbaf-05ea-abc5-000000001844 30583 1726853744.11992: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853744.12043: no more pending results, returning what we have 30583 1726853744.12047: results queue empty 30583 1726853744.12048: checking for any_errors_fatal 30583 1726853744.12056: done checking for any_errors_fatal 30583 1726853744.12057: checking for max_fail_percentage 30583 1726853744.12061: done checking for max_fail_percentage 30583 1726853744.12062: checking to see if all hosts have failed and the running result is not ok 30583 1726853744.12063: done checking to see if all hosts have failed 30583 1726853744.12064: getting the remaining hosts for this loop 30583 1726853744.12065: done getting the remaining hosts for this loop 30583 1726853744.12069: getting the next task for host managed_node2 30583 1726853744.12080: done getting next task for host managed_node2 30583 1726853744.12084: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30583 1726853744.12090: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853744.12118: getting variables 30583 1726853744.12120: in VariableManager get_vars() 30583 1726853744.12157: Calling all_inventory to load vars for managed_node2 30583 1726853744.12162: Calling groups_inventory to load vars for managed_node2 30583 1726853744.12165: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853744.12176: Calling all_plugins_play to load vars for managed_node2 30583 1726853744.12179: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853744.12181: Calling groups_plugins_play to load vars for managed_node2 30583 1726853744.12981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853744.14297: done with get_vars() 30583 1726853744.14334: done getting variables 30583 1726853744.14396: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or 
team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:35:44 -0400 (0:00:00.042) 0:01:19.481 ****** 30583 1726853744.14426: entering _queue_task() for managed_node2/service 30583 1726853744.14696: worker is 1 (out of 1 available) 30583 1726853744.14713: exiting _queue_task() for managed_node2/service 30583 1726853744.14726: done queuing things up, now waiting for results queue to drain 30583 1726853744.14728: waiting for pending results... 30583 1726853744.14931: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30583 1726853744.15043: in run() - task 02083763-bbaf-05ea-abc5-000000001845 30583 1726853744.15054: variable 'ansible_search_path' from source: unknown 30583 1726853744.15060: variable 'ansible_search_path' from source: unknown 30583 1726853744.15092: calling self._execute() 30583 1726853744.15172: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853744.15176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853744.15185: variable 'omit' from source: magic vars 30583 1726853744.15470: variable 'ansible_distribution_major_version' from source: facts 30583 1726853744.15481: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853744.15566: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853744.15702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853744.17487: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853744.17531: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853744.17556: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853744.17587: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853744.17608: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853744.17668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853744.17693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853744.17710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853744.17737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853744.17748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853744.17788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853744.17802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853744.17818: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853744.17846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853744.17856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853744.17887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853744.17905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853744.17921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853744.17946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853744.17956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853744.18075: variable 'network_connections' from source: include params 30583 1726853744.18084: variable 'interface' from source: play vars 30583 1726853744.18133: variable 'interface' from source: play vars 30583 1726853744.18188: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853744.23181: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853744.23217: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853744.23240: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853744.23263: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853744.23301: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853744.23316: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853744.23335: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853744.23353: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853744.23397: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853744.23555: variable 'network_connections' from source: include params 30583 1726853744.23559: variable 'interface' from source: play vars 30583 1726853744.23611: variable 'interface' from source: play vars 30583 1726853744.23635: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853744.23638: when evaluation is False, skipping this task 30583 
1726853744.23641: _execute() done 30583 1726853744.23643: dumping result to json 30583 1726853744.23645: done dumping result, returning 30583 1726853744.23651: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000001845] 30583 1726853744.23655: sending task result for task 02083763-bbaf-05ea-abc5-000000001845 30583 1726853744.23742: done sending task result for task 02083763-bbaf-05ea-abc5-000000001845 30583 1726853744.23752: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853744.23809: no more pending results, returning what we have 30583 1726853744.23812: results queue empty 30583 1726853744.23813: checking for any_errors_fatal 30583 1726853744.23819: done checking for any_errors_fatal 30583 1726853744.23820: checking for max_fail_percentage 30583 1726853744.23822: done checking for max_fail_percentage 30583 1726853744.23823: checking to see if all hosts have failed and the running result is not ok 30583 1726853744.23824: done checking to see if all hosts have failed 30583 1726853744.23824: getting the remaining hosts for this loop 30583 1726853744.23826: done getting the remaining hosts for this loop 30583 1726853744.23830: getting the next task for host managed_node2 30583 1726853744.23838: done getting next task for host managed_node2 30583 1726853744.23841: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30583 1726853744.23845: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853744.23866: getting variables 30583 1726853744.23868: in VariableManager get_vars() 30583 1726853744.23906: Calling all_inventory to load vars for managed_node2 30583 1726853744.23909: Calling groups_inventory to load vars for managed_node2 30583 1726853744.23911: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853744.23920: Calling all_plugins_play to load vars for managed_node2 30583 1726853744.23922: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853744.23924: Calling groups_plugins_play to load vars for managed_node2 30583 1726853744.29187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853744.30034: done with get_vars() 30583 1726853744.30059: done getting variables 30583 1726853744.30098: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:35:44 -0400 (0:00:00.156) 0:01:19.638 ****** 30583 1726853744.30120: entering _queue_task() for managed_node2/service 30583 1726853744.30410: worker is 1 (out of 1 available) 30583 1726853744.30425: exiting _queue_task() for managed_node2/service 30583 1726853744.30438: done queuing things up, now waiting for results queue to drain 30583 1726853744.30441: waiting for pending results... 30583 1726853744.30649: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30583 1726853744.30777: in run() - task 02083763-bbaf-05ea-abc5-000000001846 30583 1726853744.30793: variable 'ansible_search_path' from source: unknown 30583 1726853744.30797: variable 'ansible_search_path' from source: unknown 30583 1726853744.30825: calling self._execute() 30583 1726853744.30905: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853744.30910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853744.30918: variable 'omit' from source: magic vars 30583 1726853744.31217: variable 'ansible_distribution_major_version' from source: facts 30583 1726853744.31228: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853744.31339: variable 'network_provider' from source: set_fact 30583 1726853744.31344: variable 'network_state' from source: role '' defaults 30583 1726853744.31354: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30583 1726853744.31360: variable 'omit' from source: magic vars 30583 1726853744.31409: variable 
'omit' from source: magic vars 30583 1726853744.31428: variable 'network_service_name' from source: role '' defaults 30583 1726853744.31482: variable 'network_service_name' from source: role '' defaults 30583 1726853744.31550: variable '__network_provider_setup' from source: role '' defaults 30583 1726853744.31565: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853744.31612: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853744.31619: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853744.31670: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853744.31816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853744.33276: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853744.33334: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853744.33363: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853744.33390: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853744.33411: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853744.33472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853744.33493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853744.33513: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853744.33539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853744.33550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853744.33584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853744.33600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853744.33620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853744.33644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853744.33654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853744.33808: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853744.33888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853744.33904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853744.33921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853744.33948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853744.33961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853744.34022: variable 'ansible_python' from source: facts 30583 1726853744.34034: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853744.34094: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853744.34148: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853744.34231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853744.34249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853744.34272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853744.34296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853744.34306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853744.34338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853744.34360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853744.34378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853744.34405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853744.34415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853744.34507: variable 'network_connections' from source: include params 30583 1726853744.34513: variable 'interface' from source: play vars 30583 1726853744.34563: variable 'interface' from source: play vars 30583 1726853744.34645: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853744.34780: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853744.34820: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853744.34850: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853744.34882: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853744.34929: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853744.34947: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853744.34970: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853744.34995: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853744.35033: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853744.35229: variable 'network_connections' from source: include params 30583 1726853744.35233: variable 'interface' from source: play vars 30583 1726853744.35293: variable 'interface' from source: play vars 30583 1726853744.35325: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853744.35383: variable '__network_wireless_connections_defined' from source: role '' defaults 
30583 1726853744.35569: variable 'network_connections' from source: include params 30583 1726853744.35582: variable 'interface' from source: play vars 30583 1726853744.35624: variable 'interface' from source: play vars 30583 1726853744.35641: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853744.35700: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853744.35886: variable 'network_connections' from source: include params 30583 1726853744.35889: variable 'interface' from source: play vars 30583 1726853744.35941: variable 'interface' from source: play vars 30583 1726853744.35986: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853744.36029: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853744.36035: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853744.36080: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853744.36216: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853744.36542: variable 'network_connections' from source: include params 30583 1726853744.36545: variable 'interface' from source: play vars 30583 1726853744.36593: variable 'interface' from source: play vars 30583 1726853744.36600: variable 'ansible_distribution' from source: facts 30583 1726853744.36603: variable '__network_rh_distros' from source: role '' defaults 30583 1726853744.36609: variable 'ansible_distribution_major_version' from source: facts 30583 1726853744.36632: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853744.36744: variable 'ansible_distribution' from source: facts 30583 1726853744.36747: variable '__network_rh_distros' from source: role '' defaults 30583 1726853744.36750: variable 'ansible_distribution_major_version' from 
source: facts 30583 1726853744.36759: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853744.36873: variable 'ansible_distribution' from source: facts 30583 1726853744.36876: variable '__network_rh_distros' from source: role '' defaults 30583 1726853744.36879: variable 'ansible_distribution_major_version' from source: facts 30583 1726853744.36906: variable 'network_provider' from source: set_fact 30583 1726853744.36924: variable 'omit' from source: magic vars 30583 1726853744.36945: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853744.36969: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853744.36985: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853744.37001: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853744.37009: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853744.37031: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853744.37034: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853744.37037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853744.37176: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853744.37180: Set connection var ansible_timeout to 10 30583 1726853744.37182: Set connection var ansible_connection to ssh 30583 1726853744.37184: Set connection var ansible_shell_executable to /bin/sh 30583 1726853744.37186: Set connection var ansible_shell_type to sh 30583 1726853744.37188: Set connection var ansible_pipelining to False 30583 1726853744.37191: variable 'ansible_shell_executable' from 
source: unknown 30583 1726853744.37193: variable 'ansible_connection' from source: unknown 30583 1726853744.37195: variable 'ansible_module_compression' from source: unknown 30583 1726853744.37197: variable 'ansible_shell_type' from source: unknown 30583 1726853744.37199: variable 'ansible_shell_executable' from source: unknown 30583 1726853744.37201: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853744.37203: variable 'ansible_pipelining' from source: unknown 30583 1726853744.37205: variable 'ansible_timeout' from source: unknown 30583 1726853744.37209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853744.37239: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853744.37247: variable 'omit' from source: magic vars 30583 1726853744.37253: starting attempt loop 30583 1726853744.37256: running the handler 30583 1726853744.37313: variable 'ansible_facts' from source: unknown 30583 1726853744.37784: _low_level_execute_command(): starting 30583 1726853744.37789: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853744.38285: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853744.38288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853744.38291: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853744.38293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853744.38344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853744.38347: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853744.38349: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853744.38439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853744.40198: stdout chunk (state=3): >>>/root <<< 30583 1726853744.40298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853744.40347: stderr chunk (state=3): >>><<< 30583 1726853744.40351: stdout chunk (state=3): >>><<< 30583 1726853744.40370: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853744.40460: _low_level_execute_command(): starting 30583 1726853744.40465: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853744.403811-34287-160803325166437 `" && echo ansible-tmp-1726853744.403811-34287-160803325166437="` echo /root/.ansible/tmp/ansible-tmp-1726853744.403811-34287-160803325166437 `" ) && sleep 0' 30583 1726853744.41182: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853744.41185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853744.41188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853744.41191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853744.41194: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853744.41196: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853744.41198: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853744.41200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853744.41280: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853744.41283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853744.41378: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853744.43409: stdout chunk (state=3): >>>ansible-tmp-1726853744.403811-34287-160803325166437=/root/.ansible/tmp/ansible-tmp-1726853744.403811-34287-160803325166437 <<< 30583 1726853744.43519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853744.43542: stderr chunk (state=3): >>><<< 30583 1726853744.43545: stdout chunk (state=3): >>><<< 30583 1726853744.43563: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853744.403811-34287-160803325166437=/root/.ansible/tmp/ansible-tmp-1726853744.403811-34287-160803325166437 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853744.43590: variable 'ansible_module_compression' from source: unknown 30583 1726853744.43627: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30583 1726853744.43674: variable 'ansible_facts' from source: unknown 30583 1726853744.43809: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853744.403811-34287-160803325166437/AnsiballZ_systemd.py 30583 1726853744.43911: Sending initial data 30583 1726853744.43915: Sent initial data (155 bytes) 30583 1726853744.44347: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853744.44354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853744.44356: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853744.44361: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853744.44364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853744.44405: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853744.44411: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853744.44414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853744.44483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853744.46151: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30583 1726853744.46155: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853744.46217: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853744.46291: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp67r7ub3c /root/.ansible/tmp/ansible-tmp-1726853744.403811-34287-160803325166437/AnsiballZ_systemd.py <<< 30583 1726853744.46294: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853744.403811-34287-160803325166437/AnsiballZ_systemd.py" <<< 30583 1726853744.46360: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp67r7ub3c" to remote "/root/.ansible/tmp/ansible-tmp-1726853744.403811-34287-160803325166437/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853744.403811-34287-160803325166437/AnsiballZ_systemd.py" <<< 30583 1726853744.47530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853744.47567: stderr chunk (state=3): >>><<< 30583 1726853744.47572: stdout chunk (state=3): >>><<< 30583 1726853744.47590: done transferring module to remote 30583 1726853744.47597: _low_level_execute_command(): starting 30583 1726853744.47602: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853744.403811-34287-160803325166437/ /root/.ansible/tmp/ansible-tmp-1726853744.403811-34287-160803325166437/AnsiballZ_systemd.py && sleep 0' 30583 1726853744.48029: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853744.48032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853744.48034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 
1726853744.48036: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853744.48038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853744.48076: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853744.48092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853744.48158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853744.50031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853744.50054: stderr chunk (state=3): >>><<< 30583 1726853744.50061: stdout chunk (state=3): >>><<< 30583 1726853744.50077: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853744.50081: _low_level_execute_command(): starting 30583 1726853744.50085: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853744.403811-34287-160803325166437/AnsiballZ_systemd.py && sleep 0' 30583 1726853744.50515: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853744.50518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853744.50520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853744.50523: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853744.50574: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 
30583 1726853744.50577: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853744.50580: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853744.50656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853744.80264: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; 
start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4603904", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3311157248", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1920283000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", 
"StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredum<<< 30583 1726853744.80297: stdout chunk (state=3): >>>pReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", 
"LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", 
"RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30583 1726853744.82373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853744.82443: stderr chunk (state=3): >>>Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853744.82447: stdout chunk (state=3): >>><<< 30583 1726853744.82450: stderr chunk (state=3): >>><<< 30583 1726853744.82681: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4603904", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3311157248", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1920283000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", 
"ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853744.82692: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853744.403811-34287-160803325166437/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853744.82710: _low_level_execute_command(): starting 30583 1726853744.82721: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853744.403811-34287-160803325166437/ > /dev/null 2>&1 && sleep 0' 30583 1726853744.83342: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853744.83363: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853744.83379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853744.83395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853744.83409: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853744.83418: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853744.83429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853744.83445: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853744.83473: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853744.83551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853744.83585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853744.83681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853744.85677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853744.85680: stdout chunk (state=3): >>><<< 30583 1726853744.85683: stderr chunk (state=3): >>><<< 30583 1726853744.85703: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853744.85716: handler run complete 30583 1726853744.85789: attempt loop complete, returning result 30583 1726853744.85878: _execute() done 30583 1726853744.85881: dumping result to json 30583 1726853744.85883: done dumping result, returning 30583 1726853744.85885: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-05ea-abc5-000000001846] 30583 1726853744.85887: sending task result for task 02083763-bbaf-05ea-abc5-000000001846 30583 1726853744.86299: done sending task result for task 02083763-bbaf-05ea-abc5-000000001846 30583 1726853744.86304: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853744.86356: no more pending results, returning what we have 30583 1726853744.86362: results queue empty 30583 1726853744.86364: checking for any_errors_fatal 30583 1726853744.86374: done checking for any_errors_fatal 30583 1726853744.86375: checking for max_fail_percentage 30583 1726853744.86377: done checking for max_fail_percentage 30583 1726853744.86378: checking to see if all hosts have failed and the running result is not ok 30583 1726853744.86378: done checking to see if all hosts have failed 30583 1726853744.86379: getting the remaining hosts for this loop 30583 1726853744.86385: done getting the remaining hosts for this loop 30583 1726853744.86388: getting the next task for host managed_node2 30583 1726853744.86395: done getting next task for host managed_node2 30583 1726853744.86398: ^ task is: TASK: 
fedora.linux_system_roles.network : Enable and start wpa_supplicant 30583 1726853744.86404: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853744.86417: getting variables 30583 1726853744.86418: in VariableManager get_vars() 30583 1726853744.86451: Calling all_inventory to load vars for managed_node2 30583 1726853744.86454: Calling groups_inventory to load vars for managed_node2 30583 1726853744.86456: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853744.86468: Calling all_plugins_play to load vars for managed_node2 30583 1726853744.86521: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853744.86527: Calling groups_plugins_play to load vars for managed_node2 30583 1726853744.87661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853744.88642: done with get_vars() 30583 1726853744.88660: done getting variables 30583 1726853744.88704: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:35:44 -0400 (0:00:00.586) 0:01:20.224 ****** 30583 1726853744.88734: entering _queue_task() for managed_node2/service 30583 1726853744.89000: worker is 1 (out of 1 available) 30583 1726853744.89012: exiting _queue_task() for managed_node2/service 30583 1726853744.89024: done queuing things up, now waiting for results queue to drain 30583 1726853744.89025: waiting for pending results... 
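Each task banner above carries two timers: the previous task's duration in parentheses (`0:00:00.586`) and a cumulative playbook clock (`0:01:20.224`). A small sketch (the helper name is mine, not an Ansible API) that converts this `H:MM:SS.fff` stamp to seconds makes it easy to verify the arithmetic between consecutive banners in the log:

```python
def banner_seconds(stamp: str) -> float:
    """Convert an 'H:MM:SS.fff' banner stamp to seconds."""
    hours, minutes, seconds = stamp.split(":")
    return int(hours) * 3600 + int(minutes) * 60 + float(seconds)

# The cumulative clock from the first banner plus the next task's duration
# should equal the clock printed on the following banner.
total = banner_seconds("0:01:20.224") + banner_seconds("0:00:00.064")
print(round(total, 3))  # → 80.288, i.e. the 0:01:20.288 seen on the next banner
```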
30583 1726853744.89296: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30583 1726853744.89404: in run() - task 02083763-bbaf-05ea-abc5-000000001847 30583 1726853744.89412: variable 'ansible_search_path' from source: unknown 30583 1726853744.89415: variable 'ansible_search_path' from source: unknown 30583 1726853744.89446: calling self._execute() 30583 1726853744.89525: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853744.89528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853744.89537: variable 'omit' from source: magic vars 30583 1726853744.89840: variable 'ansible_distribution_major_version' from source: facts 30583 1726853744.90079: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853744.90082: variable 'network_provider' from source: set_fact 30583 1726853744.90085: Evaluated conditional (network_provider == "nm"): True 30583 1726853744.90087: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853744.90152: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853744.90339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853744.91825: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853744.91875: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853744.91902: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853744.91929: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853744.91949: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853744.92029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853744.92055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853744.92075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853744.92101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853744.92112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853744.92148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853744.92165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853744.92183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853744.92207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853744.92217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853744.92245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853744.92264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853744.92283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853744.92307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853744.92318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853744.92417: variable 'network_connections' from source: include params 30583 1726853744.92428: variable 'interface' from source: play vars 30583 1726853744.92479: variable 'interface' from source: play vars 30583 1726853744.92526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853744.92634: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853744.92663: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853744.92685: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853744.92708: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853744.92738: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853744.92753: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853744.92774: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853744.92791: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853744.92829: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853744.92983: variable 'network_connections' from source: include params 30583 1726853744.92987: variable 'interface' from source: play vars 30583 1726853744.93030: variable 'interface' from source: play vars 30583 1726853744.93062: Evaluated conditional (__network_wpa_supplicant_required): False 30583 1726853744.93065: when evaluation is False, skipping this task 30583 1726853744.93068: _execute() done 30583 1726853744.93070: dumping result to json 30583 1726853744.93074: done dumping result, returning 30583 1726853744.93079: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-05ea-abc5-000000001847] 30583 
1726853744.93089: sending task result for task 02083763-bbaf-05ea-abc5-000000001847 30583 1726853744.93177: done sending task result for task 02083763-bbaf-05ea-abc5-000000001847 30583 1726853744.93179: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30583 1726853744.93226: no more pending results, returning what we have 30583 1726853744.93229: results queue empty 30583 1726853744.93230: checking for any_errors_fatal 30583 1726853744.93251: done checking for any_errors_fatal 30583 1726853744.93252: checking for max_fail_percentage 30583 1726853744.93254: done checking for max_fail_percentage 30583 1726853744.93255: checking to see if all hosts have failed and the running result is not ok 30583 1726853744.93256: done checking to see if all hosts have failed 30583 1726853744.93257: getting the remaining hosts for this loop 30583 1726853744.93261: done getting the remaining hosts for this loop 30583 1726853744.93265: getting the next task for host managed_node2 30583 1726853744.93274: done getting next task for host managed_node2 30583 1726853744.93278: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30583 1726853744.93283: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853744.93302: getting variables 30583 1726853744.93304: in VariableManager get_vars() 30583 1726853744.93342: Calling all_inventory to load vars for managed_node2 30583 1726853744.93345: Calling groups_inventory to load vars for managed_node2 30583 1726853744.93347: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853744.93356: Calling all_plugins_play to load vars for managed_node2 30583 1726853744.93361: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853744.93364: Calling groups_plugins_play to load vars for managed_node2 30583 1726853744.94172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853744.95083: done with get_vars() 30583 1726853744.95099: done getting variables 30583 1726853744.95141: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:35:44 -0400 (0:00:00.064) 0:01:20.288 
****** 30583 1726853744.95167: entering _queue_task() for managed_node2/service 30583 1726853744.95405: worker is 1 (out of 1 available) 30583 1726853744.95420: exiting _queue_task() for managed_node2/service 30583 1726853744.95433: done queuing things up, now waiting for results queue to drain 30583 1726853744.95435: waiting for pending results... 30583 1726853744.95618: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30583 1726853744.95716: in run() - task 02083763-bbaf-05ea-abc5-000000001848 30583 1726853744.95727: variable 'ansible_search_path' from source: unknown 30583 1726853744.95731: variable 'ansible_search_path' from source: unknown 30583 1726853744.95762: calling self._execute() 30583 1726853744.95838: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853744.95843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853744.95851: variable 'omit' from source: magic vars 30583 1726853744.96163: variable 'ansible_distribution_major_version' from source: facts 30583 1726853744.96376: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853744.96379: variable 'network_provider' from source: set_fact 30583 1726853744.96381: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853744.96384: when evaluation is False, skipping this task 30583 1726853744.96386: _execute() done 30583 1726853744.96389: dumping result to json 30583 1726853744.96391: done dumping result, returning 30583 1726853744.96397: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-05ea-abc5-000000001848] 30583 1726853744.96399: sending task result for task 02083763-bbaf-05ea-abc5-000000001848 30583 1726853744.96466: done sending task result for task 02083763-bbaf-05ea-abc5-000000001848 30583 1726853744.96469: WORKER PROCESS EXITING skipping: 
[managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853744.96540: no more pending results, returning what we have 30583 1726853744.96543: results queue empty 30583 1726853744.96544: checking for any_errors_fatal 30583 1726853744.96549: done checking for any_errors_fatal 30583 1726853744.96550: checking for max_fail_percentage 30583 1726853744.96552: done checking for max_fail_percentage 30583 1726853744.96552: checking to see if all hosts have failed and the running result is not ok 30583 1726853744.96553: done checking to see if all hosts have failed 30583 1726853744.96554: getting the remaining hosts for this loop 30583 1726853744.96555: done getting the remaining hosts for this loop 30583 1726853744.96560: getting the next task for host managed_node2 30583 1726853744.96567: done getting next task for host managed_node2 30583 1726853744.96570: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853744.96576: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853744.96594: getting variables 30583 1726853744.96595: in VariableManager get_vars() 30583 1726853744.96662: Calling all_inventory to load vars for managed_node2 30583 1726853744.96665: Calling groups_inventory to load vars for managed_node2 30583 1726853744.96667: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853744.96677: Calling all_plugins_play to load vars for managed_node2 30583 1726853744.96680: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853744.96682: Calling groups_plugins_play to load vars for managed_node2 30583 1726853744.97969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853744.98952: done with get_vars() 30583 1726853744.98966: done getting variables 30583 1726853744.99010: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:35:44 -0400 (0:00:00.038) 0:01:20.327 ****** 30583 1726853744.99035: entering _queue_task() for managed_node2/copy 30583 1726853744.99269: worker is 1 (out of 1 available) 30583 1726853744.99285: exiting _queue_task() for managed_node2/copy 30583 1726853744.99297: done queuing things up, now waiting for results queue to drain 30583 1726853744.99299: waiting for 
pending results... 30583 1726853744.99501: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853744.99592: in run() - task 02083763-bbaf-05ea-abc5-000000001849 30583 1726853744.99602: variable 'ansible_search_path' from source: unknown 30583 1726853744.99605: variable 'ansible_search_path' from source: unknown 30583 1726853744.99639: calling self._execute() 30583 1726853744.99715: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853744.99720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853744.99728: variable 'omit' from source: magic vars 30583 1726853745.00024: variable 'ansible_distribution_major_version' from source: facts 30583 1726853745.00033: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853745.00121: variable 'network_provider' from source: set_fact 30583 1726853745.00125: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853745.00128: when evaluation is False, skipping this task 30583 1726853745.00131: _execute() done 30583 1726853745.00133: dumping result to json 30583 1726853745.00137: done dumping result, returning 30583 1726853745.00146: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-05ea-abc5-000000001849] 30583 1726853745.00149: sending task result for task 02083763-bbaf-05ea-abc5-000000001849 30583 1726853745.00238: done sending task result for task 02083763-bbaf-05ea-abc5-000000001849 30583 1726853745.00241: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30583 1726853745.00288: no more pending results, returning what we have 30583 1726853745.00291: results queue empty 30583 
1726853745.00293: checking for any_errors_fatal 30583 1726853745.00301: done checking for any_errors_fatal 30583 1726853745.00302: checking for max_fail_percentage 30583 1726853745.00303: done checking for max_fail_percentage 30583 1726853745.00304: checking to see if all hosts have failed and the running result is not ok 30583 1726853745.00305: done checking to see if all hosts have failed 30583 1726853745.00306: getting the remaining hosts for this loop 30583 1726853745.00308: done getting the remaining hosts for this loop 30583 1726853745.00311: getting the next task for host managed_node2 30583 1726853745.00319: done getting next task for host managed_node2 30583 1726853745.00323: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853745.00327: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853745.00347: getting variables 30583 1726853745.00351: in VariableManager get_vars() 30583 1726853745.00387: Calling all_inventory to load vars for managed_node2 30583 1726853745.00389: Calling groups_inventory to load vars for managed_node2 30583 1726853745.00391: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853745.00400: Calling all_plugins_play to load vars for managed_node2 30583 1726853745.00403: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853745.00405: Calling groups_plugins_play to load vars for managed_node2 30583 1726853745.01146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853745.02023: done with get_vars() 30583 1726853745.02037: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:35:45 -0400 (0:00:00.030) 0:01:20.358 ****** 30583 1726853745.02100: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853745.02314: worker is 1 (out of 1 available) 30583 1726853745.02328: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853745.02341: done queuing things up, now waiting for results queue to drain 30583 1726853745.02342: waiting for pending results... 
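The two tasks skipped above ("Enable network service" and "Ensure initscripts network file dependency is present") were both gated on the evaluated conditionals visible in the log: `ansible_distribution_major_version != '6'` (True) and `network_provider == "initscripts"` (False). A minimal sketch of a task gated the same way is below — the `when:` expressions are taken from the log, but the `service` arguments are illustrative assumptions, not the role's actual task body (which lives at `roles/network/tasks/main.yml:142` per the task path above):

```yaml
# Sketch only: same conditional shape the role evaluates.
# With network_provider == "nm" (as in this run), the second
# condition is False and the task is skipped, exactly as logged.
- name: Enable network service
  ansible.builtin.service:
    name: network        # assumed service name, for illustration
    enabled: true
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "initscripts"
```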
30583 1726853745.02531: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853745.02636: in run() - task 02083763-bbaf-05ea-abc5-00000000184a 30583 1726853745.02648: variable 'ansible_search_path' from source: unknown 30583 1726853745.02652: variable 'ansible_search_path' from source: unknown 30583 1726853745.02684: calling self._execute() 30583 1726853745.02754: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853745.02761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853745.02766: variable 'omit' from source: magic vars 30583 1726853745.03043: variable 'ansible_distribution_major_version' from source: facts 30583 1726853745.03053: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853745.03061: variable 'omit' from source: magic vars 30583 1726853745.03104: variable 'omit' from source: magic vars 30583 1726853745.03217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853745.04906: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853745.04948: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853745.04977: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853745.05002: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853745.05021: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853745.05079: variable 'network_provider' from source: set_fact 30583 1726853745.05180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853745.05184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853745.05204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853745.05229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853745.05240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853745.05295: variable 'omit' from source: magic vars 30583 1726853745.05366: variable 'omit' from source: magic vars 30583 1726853745.05437: variable 'network_connections' from source: include params 30583 1726853745.05446: variable 'interface' from source: play vars 30583 1726853745.05491: variable 'interface' from source: play vars 30583 1726853745.05600: variable 'omit' from source: magic vars 30583 1726853745.05608: variable '__lsr_ansible_managed' from source: task vars 30583 1726853745.05649: variable '__lsr_ansible_managed' from source: task vars 30583 1726853745.05766: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30583 1726853745.05905: Loaded config def from plugin (lookup/template) 30583 1726853745.05909: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30583 1726853745.05929: File lookup term: get_ansible_managed.j2 30583 1726853745.05932: variable 
'ansible_search_path' from source: unknown 30583 1726853745.05937: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30583 1726853745.05949: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30583 1726853745.05976: variable 'ansible_search_path' from source: unknown 30583 1726853745.09147: variable 'ansible_managed' from source: unknown 30583 1726853745.09225: variable 'omit' from source: magic vars 30583 1726853745.09245: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853745.09266: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853745.09283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853745.09297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30583 1726853745.09305: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853745.09327: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853745.09329: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853745.09332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853745.09395: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853745.09400: Set connection var ansible_timeout to 10 30583 1726853745.09402: Set connection var ansible_connection to ssh 30583 1726853745.09410: Set connection var ansible_shell_executable to /bin/sh 30583 1726853745.09412: Set connection var ansible_shell_type to sh 30583 1726853745.09418: Set connection var ansible_pipelining to False 30583 1726853745.09437: variable 'ansible_shell_executable' from source: unknown 30583 1726853745.09440: variable 'ansible_connection' from source: unknown 30583 1726853745.09442: variable 'ansible_module_compression' from source: unknown 30583 1726853745.09444: variable 'ansible_shell_type' from source: unknown 30583 1726853745.09446: variable 'ansible_shell_executable' from source: unknown 30583 1726853745.09450: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853745.09452: variable 'ansible_pipelining' from source: unknown 30583 1726853745.09454: variable 'ansible_timeout' from source: unknown 30583 1726853745.09461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853745.09548: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853745.09563: variable 'omit' from 
source: magic vars 30583 1726853745.09566: starting attempt loop 30583 1726853745.09568: running the handler 30583 1726853745.09580: _low_level_execute_command(): starting 30583 1726853745.09585: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853745.10076: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853745.10080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853745.10083: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853745.10098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853745.10144: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853745.10147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853745.10149: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853745.10243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853745.11985: stdout chunk (state=3): >>>/root <<< 30583 1726853745.12083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 
1726853745.12114: stderr chunk (state=3): >>><<< 30583 1726853745.12118: stdout chunk (state=3): >>><<< 30583 1726853745.12136: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853745.12146: _low_level_execute_command(): starting 30583 1726853745.12151: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853745.1213555-34331-255651633986504 `" && echo ansible-tmp-1726853745.1213555-34331-255651633986504="` echo /root/.ansible/tmp/ansible-tmp-1726853745.1213555-34331-255651633986504 `" ) && sleep 0' 30583 1726853745.12578: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30583 1726853745.12581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853745.12584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853745.12586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853745.12638: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853745.12644: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853745.12646: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853745.12718: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853745.14739: stdout chunk (state=3): >>>ansible-tmp-1726853745.1213555-34331-255651633986504=/root/.ansible/tmp/ansible-tmp-1726853745.1213555-34331-255651633986504 <<< 30583 1726853745.14844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853745.14875: stderr chunk (state=3): >>><<< 30583 1726853745.14879: stdout chunk (state=3): >>><<< 30583 1726853745.14893: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853745.1213555-34331-255651633986504=/root/.ansible/tmp/ansible-tmp-1726853745.1213555-34331-255651633986504 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853745.14928: variable 'ansible_module_compression' from source: unknown 30583 1726853745.14966: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30583 1726853745.14993: variable 'ansible_facts' from source: unknown 30583 1726853745.15060: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853745.1213555-34331-255651633986504/AnsiballZ_network_connections.py 30583 1726853745.15153: Sending initial data 30583 1726853745.15157: Sent initial data (168 bytes) 30583 1726853745.15596: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853745.15599: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853745.15605: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853745.15610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853745.15612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853745.15653: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853745.15656: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853745.15732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853745.17369: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30583 1726853745.17374: stderr chunk (state=3): 
>>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853745.17438: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853745.17507: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpd600t_a1 /root/.ansible/tmp/ansible-tmp-1726853745.1213555-34331-255651633986504/AnsiballZ_network_connections.py <<< 30583 1726853745.17510: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853745.1213555-34331-255651633986504/AnsiballZ_network_connections.py" <<< 30583 1726853745.17576: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpd600t_a1" to remote "/root/.ansible/tmp/ansible-tmp-1726853745.1213555-34331-255651633986504/AnsiballZ_network_connections.py" <<< 30583 1726853745.17579: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853745.1213555-34331-255651633986504/AnsiballZ_network_connections.py" <<< 30583 1726853745.18391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853745.18432: stderr chunk (state=3): >>><<< 30583 1726853745.18435: stdout chunk (state=3): >>><<< 30583 1726853745.18470: done transferring module to remote 30583 1726853745.18481: _low_level_execute_command(): starting 30583 1726853745.18486: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853745.1213555-34331-255651633986504/ /root/.ansible/tmp/ansible-tmp-1726853745.1213555-34331-255651633986504/AnsiballZ_network_connections.py && sleep 0' 30583 1726853745.18920: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853745.18923: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853745.18925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853745.18927: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853745.18929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853745.18931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853745.18977: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853745.18981: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853745.18995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853745.19064: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853745.20939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853745.20965: stderr chunk (state=3): >>><<< 30583 1726853745.20968: stdout chunk (state=3): >>><<< 30583 1726853745.20986: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853745.20989: _low_level_execute_command(): starting 30583 1726853745.20991: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853745.1213555-34331-255651633986504/AnsiballZ_network_connections.py && sleep 0' 30583 1726853745.21415: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853745.21419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853745.21421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853745.21423: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853745.21425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853745.21479: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853745.21485: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853745.21561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853745.51580: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 5c3c483d-e950-47f9-9afb-d5e74f691954\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30583 1726853745.54168: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853745.54175: stdout chunk (state=3): >>><<< 30583 1726853745.54181: stderr chunk (state=3): >>><<< 30583 1726853745.54199: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 5c3c483d-e950-47f9-9afb-d5e74f691954\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853745.54242: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853745.1213555-34331-255651633986504/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853745.54252: _low_level_execute_command(): starting 30583 1726853745.54257: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853745.1213555-34331-255651633986504/ > /dev/null 2>&1 && sleep 0' 30583 1726853745.54914: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853745.54923: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853745.54934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853745.55077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853745.55080: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853745.55082: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853745.55085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853745.55087: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853745.55089: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853745.55091: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853745.55093: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853745.55095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853745.55097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853745.55131: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853745.55143: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853745.55160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853745.55266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853745.57278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853745.57282: stdout chunk (state=3): >>><<< 30583 1726853745.57284: stderr chunk (state=3): >>><<< 30583 1726853745.57398: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853745.57406: handler run complete 30583 1726853745.57409: attempt loop complete, returning result 30583 1726853745.57411: _execute() done 30583 1726853745.57414: dumping result to json 30583 1726853745.57416: done dumping result, returning 30583 1726853745.57418: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-05ea-abc5-00000000184a] 30583 1726853745.57420: sending task result for task 02083763-bbaf-05ea-abc5-00000000184a 30583 1726853745.57492: done sending task result for task 02083763-bbaf-05ea-abc5-00000000184a 30583 1726853745.57495: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, 
"provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 5c3c483d-e950-47f9-9afb-d5e74f691954 30583 1726853745.57609: no more pending results, returning what we have 30583 1726853745.57613: results queue empty 30583 1726853745.57614: checking for any_errors_fatal 30583 1726853745.57619: done checking for any_errors_fatal 30583 1726853745.57620: checking for max_fail_percentage 30583 1726853745.57622: done checking for max_fail_percentage 30583 1726853745.57622: checking to see if all hosts have failed and the running result is not ok 30583 1726853745.57623: done checking to see if all hosts have failed 30583 1726853745.57624: getting the remaining hosts for this loop 30583 1726853745.57626: done getting the remaining hosts for this loop 30583 1726853745.57629: getting the next task for host managed_node2 30583 1726853745.57636: done getting next task for host managed_node2 30583 1726853745.57640: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853745.57645: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853745.57656: getting variables 30583 1726853745.57657: in VariableManager get_vars() 30583 1726853745.57862: Calling all_inventory to load vars for managed_node2 30583 1726853745.57866: Calling groups_inventory to load vars for managed_node2 30583 1726853745.57868: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853745.57883: Calling all_plugins_play to load vars for managed_node2 30583 1726853745.57886: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853745.57889: Calling groups_plugins_play to load vars for managed_node2 30583 1726853745.59378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853745.60933: done with get_vars() 30583 1726853745.60955: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:35:45 -0400 (0:00:00.589) 0:01:20.947 ****** 30583 1726853745.61045: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30583 1726853745.61406: worker is 1 (out of 1 available) 30583 1726853745.61421: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30583 1726853745.61433: done queuing things up, now waiting for results queue to drain 30583 1726853745.61434: waiting for pending results... 
30583 1726853745.61861: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853745.61890: in run() - task 02083763-bbaf-05ea-abc5-00000000184b 30583 1726853745.61911: variable 'ansible_search_path' from source: unknown 30583 1726853745.61915: variable 'ansible_search_path' from source: unknown 30583 1726853745.61954: calling self._execute() 30583 1726853745.62051: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853745.62056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853745.62068: variable 'omit' from source: magic vars 30583 1726853745.62475: variable 'ansible_distribution_major_version' from source: facts 30583 1726853745.62486: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853745.62616: variable 'network_state' from source: role '' defaults 30583 1726853745.62626: Evaluated conditional (network_state != {}): False 30583 1726853745.62629: when evaluation is False, skipping this task 30583 1726853745.62632: _execute() done 30583 1726853745.62635: dumping result to json 30583 1726853745.62638: done dumping result, returning 30583 1726853745.62675: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-05ea-abc5-00000000184b] 30583 1726853745.62679: sending task result for task 02083763-bbaf-05ea-abc5-00000000184b 30583 1726853745.62917: done sending task result for task 02083763-bbaf-05ea-abc5-00000000184b 30583 1726853745.62921: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853745.62964: no more pending results, returning what we have 30583 1726853745.62968: results queue empty 30583 1726853745.62969: checking for any_errors_fatal 30583 1726853745.63043: done checking for any_errors_fatal 
30583 1726853745.63045: checking for max_fail_percentage 30583 1726853745.63047: done checking for max_fail_percentage 30583 1726853745.63048: checking to see if all hosts have failed and the running result is not ok 30583 1726853745.63048: done checking to see if all hosts have failed 30583 1726853745.63049: getting the remaining hosts for this loop 30583 1726853745.63051: done getting the remaining hosts for this loop 30583 1726853745.63054: getting the next task for host managed_node2 30583 1726853745.63063: done getting next task for host managed_node2 30583 1726853745.63067: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30583 1726853745.63073: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853745.63091: getting variables 30583 1726853745.63093: in VariableManager get_vars() 30583 1726853745.63128: Calling all_inventory to load vars for managed_node2 30583 1726853745.63130: Calling groups_inventory to load vars for managed_node2 30583 1726853745.63133: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853745.63141: Calling all_plugins_play to load vars for managed_node2 30583 1726853745.63144: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853745.63147: Calling groups_plugins_play to load vars for managed_node2 30583 1726853745.64426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853745.65989: done with get_vars() 30583 1726853745.66013: done getting variables 30583 1726853745.66077: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:35:45 -0400 (0:00:00.050) 0:01:20.998 ****** 30583 1726853745.66116: entering _queue_task() for managed_node2/debug 30583 1726853745.66479: worker is 1 (out of 1 available) 30583 1726853745.66492: exiting _queue_task() for managed_node2/debug 30583 1726853745.66504: done queuing things up, now waiting for results queue to drain 30583 1726853745.66506: waiting for pending results... 
30583 1726853745.66817: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30583 1726853745.66977: in run() - task 02083763-bbaf-05ea-abc5-00000000184c 30583 1726853745.66982: variable 'ansible_search_path' from source: unknown 30583 1726853745.66985: variable 'ansible_search_path' from source: unknown 30583 1726853745.67013: calling self._execute() 30583 1726853745.67114: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853745.67117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853745.67166: variable 'omit' from source: magic vars 30583 1726853745.67527: variable 'ansible_distribution_major_version' from source: facts 30583 1726853745.67542: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853745.67549: variable 'omit' from source: magic vars 30583 1726853745.67613: variable 'omit' from source: magic vars 30583 1726853745.67708: variable 'omit' from source: magic vars 30583 1726853745.67711: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853745.67733: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853745.67754: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853745.67780: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853745.67792: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853745.67824: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853745.67827: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853745.67829: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 30583 1726853745.67936: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853745.67979: Set connection var ansible_timeout to 10 30583 1726853745.67983: Set connection var ansible_connection to ssh 30583 1726853745.67985: Set connection var ansible_shell_executable to /bin/sh 30583 1726853745.67988: Set connection var ansible_shell_type to sh 30583 1726853745.67990: Set connection var ansible_pipelining to False 30583 1726853745.67992: variable 'ansible_shell_executable' from source: unknown 30583 1726853745.67997: variable 'ansible_connection' from source: unknown 30583 1726853745.67999: variable 'ansible_module_compression' from source: unknown 30583 1726853745.68002: variable 'ansible_shell_type' from source: unknown 30583 1726853745.68004: variable 'ansible_shell_executable' from source: unknown 30583 1726853745.68035: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853745.68038: variable 'ansible_pipelining' from source: unknown 30583 1726853745.68040: variable 'ansible_timeout' from source: unknown 30583 1726853745.68042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853745.68160: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853745.68217: variable 'omit' from source: magic vars 30583 1726853745.68221: starting attempt loop 30583 1726853745.68223: running the handler 30583 1726853745.68317: variable '__network_connections_result' from source: set_fact 30583 1726853745.68467: handler run complete 30583 1726853745.68470: attempt loop complete, returning result 30583 1726853745.68474: _execute() done 30583 1726853745.68476: dumping result to json 30583 1726853745.68478: 
done dumping result, returning 30583 1726853745.68480: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-05ea-abc5-00000000184c] 30583 1726853745.68481: sending task result for task 02083763-bbaf-05ea-abc5-00000000184c 30583 1726853745.68543: done sending task result for task 02083763-bbaf-05ea-abc5-00000000184c 30583 1726853745.68547: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 5c3c483d-e950-47f9-9afb-d5e74f691954" ] } 30583 1726853745.68624: no more pending results, returning what we have 30583 1726853745.68628: results queue empty 30583 1726853745.68629: checking for any_errors_fatal 30583 1726853745.68637: done checking for any_errors_fatal 30583 1726853745.68638: checking for max_fail_percentage 30583 1726853745.68640: done checking for max_fail_percentage 30583 1726853745.68641: checking to see if all hosts have failed and the running result is not ok 30583 1726853745.68642: done checking to see if all hosts have failed 30583 1726853745.68642: getting the remaining hosts for this loop 30583 1726853745.68644: done getting the remaining hosts for this loop 30583 1726853745.68648: getting the next task for host managed_node2 30583 1726853745.68660: done getting next task for host managed_node2 30583 1726853745.68664: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30583 1726853745.68669: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853745.68685: getting variables 30583 1726853745.68687: in VariableManager get_vars() 30583 1726853745.68730: Calling all_inventory to load vars for managed_node2 30583 1726853745.68733: Calling groups_inventory to load vars for managed_node2 30583 1726853745.68735: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853745.68746: Calling all_plugins_play to load vars for managed_node2 30583 1726853745.68749: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853745.68752: Calling groups_plugins_play to load vars for managed_node2 30583 1726853745.70453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853745.72015: done with get_vars() 30583 1726853745.72040: done getting variables 30583 1726853745.72108: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:35:45 -0400 (0:00:00.060) 0:01:21.058 ****** 30583 1726853745.72150: entering _queue_task() for managed_node2/debug 30583 1726853745.72513: worker is 1 (out of 1 available) 30583 1726853745.72527: exiting _queue_task() for managed_node2/debug 30583 1726853745.72539: done queuing things up, now waiting for results queue to drain 30583 1726853745.72540: waiting for pending results... 30583 1726853745.72936: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30583 1726853745.73023: in run() - task 02083763-bbaf-05ea-abc5-00000000184d 30583 1726853745.73143: variable 'ansible_search_path' from source: unknown 30583 1726853745.73147: variable 'ansible_search_path' from source: unknown 30583 1726853745.73152: calling self._execute() 30583 1726853745.73189: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853745.73193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853745.73204: variable 'omit' from source: magic vars 30583 1726853745.73602: variable 'ansible_distribution_major_version' from source: facts 30583 1726853745.73613: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853745.73621: variable 'omit' from source: magic vars 30583 1726853745.73690: variable 'omit' from source: magic vars 30583 1726853745.73730: variable 'omit' from source: magic vars 30583 1726853745.73774: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853745.73813: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853745.73834: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853745.73852: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853745.73867: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853745.73902: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853745.73905: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853745.73908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853745.74121: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853745.74124: Set connection var ansible_timeout to 10 30583 1726853745.74127: Set connection var ansible_connection to ssh 30583 1726853745.74129: Set connection var ansible_shell_executable to /bin/sh 30583 1726853745.74131: Set connection var ansible_shell_type to sh 30583 1726853745.74133: Set connection var ansible_pipelining to False 30583 1726853745.74136: variable 'ansible_shell_executable' from source: unknown 30583 1726853745.74138: variable 'ansible_connection' from source: unknown 30583 1726853745.74141: variable 'ansible_module_compression' from source: unknown 30583 1726853745.74144: variable 'ansible_shell_type' from source: unknown 30583 1726853745.74146: variable 'ansible_shell_executable' from source: unknown 30583 1726853745.74149: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853745.74151: variable 'ansible_pipelining' from source: unknown 30583 1726853745.74154: variable 'ansible_timeout' from source: unknown 30583 1726853745.74156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853745.74245: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853745.74257: variable 'omit' from source: magic vars 30583 1726853745.74265: starting attempt loop 30583 1726853745.74269: running the handler 30583 1726853745.74319: variable '__network_connections_result' from source: set_fact 30583 1726853745.74399: variable '__network_connections_result' from source: set_fact 30583 1726853745.74523: handler run complete 30583 1726853745.74552: attempt loop complete, returning result 30583 1726853745.74556: _execute() done 30583 1726853745.74559: dumping result to json 30583 1726853745.74561: done dumping result, returning 30583 1726853745.74570: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-05ea-abc5-00000000184d] 30583 1726853745.74576: sending task result for task 02083763-bbaf-05ea-abc5-00000000184d ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 5c3c483d-e950-47f9-9afb-d5e74f691954\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 5c3c483d-e950-47f9-9afb-d5e74f691954" ] } } 30583 1726853745.74814: done sending task result for task 02083763-bbaf-05ea-abc5-00000000184d 30583 1726853745.74832: no more pending results, returning what we have 30583 1726853745.74836: results queue empty 30583 1726853745.74837: checking for 
any_errors_fatal 30583 1726853745.74845: done checking for any_errors_fatal 30583 1726853745.74846: checking for max_fail_percentage 30583 1726853745.74848: done checking for max_fail_percentage 30583 1726853745.74849: checking to see if all hosts have failed and the running result is not ok 30583 1726853745.74850: done checking to see if all hosts have failed 30583 1726853745.74851: getting the remaining hosts for this loop 30583 1726853745.74853: done getting the remaining hosts for this loop 30583 1726853745.74856: getting the next task for host managed_node2 30583 1726853745.74869: done getting next task for host managed_node2 30583 1726853745.74874: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30583 1726853745.74879: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853745.74893: WORKER PROCESS EXITING 30583 1726853745.74901: getting variables 30583 1726853745.74903: in VariableManager get_vars() 30583 1726853745.74953: Calling all_inventory to load vars for managed_node2 30583 1726853745.74956: Calling groups_inventory to load vars for managed_node2 30583 1726853745.74962: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853745.74975: Calling all_plugins_play to load vars for managed_node2 30583 1726853745.74979: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853745.74982: Calling groups_plugins_play to load vars for managed_node2 30583 1726853745.76546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853745.78268: done with get_vars() 30583 1726853745.78309: done getting variables 30583 1726853745.78369: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:35:45 -0400 (0:00:00.062) 0:01:21.121 ****** 30583 1726853745.78409: entering _queue_task() for managed_node2/debug 30583 1726853745.78752: worker is 1 (out of 1 available) 30583 1726853745.78767: exiting _queue_task() for managed_node2/debug 30583 1726853745.78781: done queuing things up, now waiting for results queue to drain 30583 1726853745.78783: waiting for pending results... 
30583 1726853745.79180: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30583 1726853745.79239: in run() - task 02083763-bbaf-05ea-abc5-00000000184e 30583 1726853745.79262: variable 'ansible_search_path' from source: unknown 30583 1726853745.79274: variable 'ansible_search_path' from source: unknown 30583 1726853745.79321: calling self._execute() 30583 1726853745.79423: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853745.79436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853745.79450: variable 'omit' from source: magic vars 30583 1726853745.79843: variable 'ansible_distribution_major_version' from source: facts 30583 1726853745.79864: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853745.80062: variable 'network_state' from source: role '' defaults 30583 1726853745.80066: Evaluated conditional (network_state != {}): False 30583 1726853745.80068: when evaluation is False, skipping this task 30583 1726853745.80072: _execute() done 30583 1726853745.80075: dumping result to json 30583 1726853745.80077: done dumping result, returning 30583 1726853745.80080: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-05ea-abc5-00000000184e] 30583 1726853745.80082: sending task result for task 02083763-bbaf-05ea-abc5-00000000184e 30583 1726853745.80151: done sending task result for task 02083763-bbaf-05ea-abc5-00000000184e 30583 1726853745.80154: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 30583 1726853745.80213: no more pending results, returning what we have 30583 1726853745.80217: results queue empty 30583 1726853745.80218: checking for any_errors_fatal 30583 1726853745.80228: done checking for any_errors_fatal 30583 1726853745.80229: checking for 
max_fail_percentage 30583 1726853745.80231: done checking for max_fail_percentage 30583 1726853745.80232: checking to see if all hosts have failed and the running result is not ok 30583 1726853745.80233: done checking to see if all hosts have failed 30583 1726853745.80233: getting the remaining hosts for this loop 30583 1726853745.80237: done getting the remaining hosts for this loop 30583 1726853745.80241: getting the next task for host managed_node2 30583 1726853745.80248: done getting next task for host managed_node2 30583 1726853745.80253: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30583 1726853745.80261: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853745.80289: getting variables 30583 1726853745.80291: in VariableManager get_vars() 30583 1726853745.80335: Calling all_inventory to load vars for managed_node2 30583 1726853745.80339: Calling groups_inventory to load vars for managed_node2 30583 1726853745.80342: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853745.80353: Calling all_plugins_play to load vars for managed_node2 30583 1726853745.80357: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853745.80362: Calling groups_plugins_play to load vars for managed_node2 30583 1726853745.82052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853745.83662: done with get_vars() 30583 1726853745.83690: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:35:45 -0400 (0:00:00.053) 0:01:21.175 ****** 30583 1726853745.83793: entering _queue_task() for managed_node2/ping 30583 1726853745.84151: worker is 1 (out of 1 available) 30583 1726853745.84167: exiting _queue_task() for managed_node2/ping 30583 1726853745.84182: done queuing things up, now waiting for results queue to drain 30583 1726853745.84183: waiting for pending results... 
30583 1726853745.84595: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 30583 1726853745.84669: in run() - task 02083763-bbaf-05ea-abc5-00000000184f 30583 1726853745.84696: variable 'ansible_search_path' from source: unknown 30583 1726853745.84711: variable 'ansible_search_path' from source: unknown 30583 1726853745.84769: calling self._execute() 30583 1726853745.84882: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853745.84906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853745.84910: variable 'omit' from source: magic vars 30583 1726853745.85323: variable 'ansible_distribution_major_version' from source: facts 30583 1726853745.85449: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853745.85452: variable 'omit' from source: magic vars 30583 1726853745.85455: variable 'omit' from source: magic vars 30583 1726853745.85468: variable 'omit' from source: magic vars 30583 1726853745.85514: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853745.85561: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853745.85588: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853745.85622: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853745.85726: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853745.85765: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853745.85777: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853745.85785: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30583 1726853745.85895: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853745.85906: Set connection var ansible_timeout to 10 30583 1726853745.85912: Set connection var ansible_connection to ssh 30583 1726853745.85922: Set connection var ansible_shell_executable to /bin/sh 30583 1726853745.85932: Set connection var ansible_shell_type to sh 30583 1726853745.85948: Set connection var ansible_pipelining to False 30583 1726853745.85986: variable 'ansible_shell_executable' from source: unknown 30583 1726853745.86047: variable 'ansible_connection' from source: unknown 30583 1726853745.86050: variable 'ansible_module_compression' from source: unknown 30583 1726853745.86053: variable 'ansible_shell_type' from source: unknown 30583 1726853745.86055: variable 'ansible_shell_executable' from source: unknown 30583 1726853745.86057: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853745.86061: variable 'ansible_pipelining' from source: unknown 30583 1726853745.86063: variable 'ansible_timeout' from source: unknown 30583 1726853745.86065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853745.86476: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853745.86594: variable 'omit' from source: magic vars 30583 1726853745.86597: starting attempt loop 30583 1726853745.86600: running the handler 30583 1726853745.86602: _low_level_execute_command(): starting 30583 1726853745.86604: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853745.87592: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853745.87609: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853745.87624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853745.87649: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853745.87757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853745.89517: stdout chunk (state=3): >>>/root <<< 30583 1726853745.89656: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853745.89668: stdout chunk (state=3): >>><<< 30583 1726853745.89688: stderr chunk (state=3): >>><<< 30583 1726853745.89712: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853745.89731: _low_level_execute_command(): starting 30583 1726853745.89742: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853745.8971918-34359-186972290007666 `" && echo ansible-tmp-1726853745.8971918-34359-186972290007666="` echo /root/.ansible/tmp/ansible-tmp-1726853745.8971918-34359-186972290007666 `" ) && sleep 0' 30583 1726853745.90802: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853745.90878: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853745.90930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853745.90945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853745.91169: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853745.91217: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853745.91298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853745.93390: stdout chunk (state=3): >>>ansible-tmp-1726853745.8971918-34359-186972290007666=/root/.ansible/tmp/ansible-tmp-1726853745.8971918-34359-186972290007666 <<< 30583 1726853745.93556: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853745.93559: stdout chunk (state=3): >>><<< 30583 1726853745.93561: stderr chunk (state=3): >>><<< 30583 1726853745.93663: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853745.8971918-34359-186972290007666=/root/.ansible/tmp/ansible-tmp-1726853745.8971918-34359-186972290007666 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853745.93667: variable 'ansible_module_compression' from source: unknown 30583 1726853745.93689: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30583 1726853745.93729: variable 'ansible_facts' from source: unknown 30583 1726853745.93821: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853745.8971918-34359-186972290007666/AnsiballZ_ping.py 30583 1726853745.94054: Sending initial data 30583 1726853745.94068: Sent initial data (153 bytes) 30583 1726853745.94633: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853745.94653: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853745.94763: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853745.96466: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853745.96554: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853745.96656: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpl5o5ko9b /root/.ansible/tmp/ansible-tmp-1726853745.8971918-34359-186972290007666/AnsiballZ_ping.py <<< 30583 1726853745.96660: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853745.8971918-34359-186972290007666/AnsiballZ_ping.py" <<< 30583 1726853745.96719: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpl5o5ko9b" to remote "/root/.ansible/tmp/ansible-tmp-1726853745.8971918-34359-186972290007666/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853745.8971918-34359-186972290007666/AnsiballZ_ping.py" <<< 30583 1726853745.97738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853745.97850: stderr chunk (state=3): >>><<< 30583 1726853745.97853: stdout chunk (state=3): >>><<< 30583 1726853745.97856: done transferring module to remote 30583 1726853745.97858: _low_level_execute_command(): starting 30583 1726853745.97861: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853745.8971918-34359-186972290007666/ /root/.ansible/tmp/ansible-tmp-1726853745.8971918-34359-186972290007666/AnsiballZ_ping.py && sleep 0' 30583 1726853745.99160: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853745.99288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853745.99306: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853745.99322: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853745.99419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853746.01337: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853746.01396: stderr chunk (state=3): >>><<< 30583 1726853746.01405: stdout chunk (state=3): >>><<< 30583 1726853746.01428: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853746.01446: _low_level_execute_command(): starting 30583 1726853746.01456: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853745.8971918-34359-186972290007666/AnsiballZ_ping.py && sleep 0' 30583 1726853746.02106: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853746.02166: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853746.02185: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853746.02212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853746.02325: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853746.18307: stdout 
chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30583 1726853746.19638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853746.19666: stderr chunk (state=3): >>><<< 30583 1726853746.19675: stdout chunk (state=3): >>><<< 30583 1726853746.19697: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853746.19724: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853745.8971918-34359-186972290007666/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853746.19729: _low_level_execute_command(): starting 30583 1726853746.19734: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853745.8971918-34359-186972290007666/ > /dev/null 2>&1 && sleep 0' 30583 1726853746.20167: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853746.20172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853746.20175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853746.20177: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853746.20186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853746.20239: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853746.20243: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853746.20258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853746.20355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853746.22359: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853746.22363: stdout chunk (state=3): >>><<< 30583 1726853746.22365: stderr chunk (state=3): >>><<< 30583 1726853746.22384: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0
30583 1726853746.22395: handler run complete
30583 1726853746.22576: attempt loop complete, returning result
30583 1726853746.22580: _execute() done
30583 1726853746.22582: dumping result to json
30583 1726853746.22584: done dumping result, returning
30583 1726853746.22586: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-05ea-abc5-00000000184f]
30583 1726853746.22588: sending task result for task 02083763-bbaf-05ea-abc5-00000000184f
30583 1726853746.22658: done sending task result for task 02083763-bbaf-05ea-abc5-00000000184f
30583 1726853746.22661: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false,
    "ping": "pong"
}
30583 1726853746.22737: no more pending results, returning what we have
30583 1726853746.22740: results queue empty
30583 1726853746.22742: checking for any_errors_fatal
30583 1726853746.22750: done checking for any_errors_fatal
30583 1726853746.22751: checking for max_fail_percentage
30583 1726853746.22753: done checking for max_fail_percentage
30583 1726853746.22754: checking to see if all hosts have failed and the running result is not ok
30583 1726853746.22755: done checking to see if all hosts have failed
30583 1726853746.22756: getting the remaining hosts for this loop
30583 1726853746.22757: done getting the remaining hosts for this loop
30583 1726853746.22761: getting the next task for host managed_node2
30583 1726853746.22775: done getting next task for host managed_node2
30583 1726853746.22778: ^ task is: TASK: meta (role_complete)
30583 1726853746.22783: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853746.22796: getting variables 30583 1726853746.22798: in VariableManager get_vars() 30583 1726853746.22845: Calling all_inventory to load vars for managed_node2 30583 1726853746.22848: Calling groups_inventory to load vars for managed_node2 30583 1726853746.22850: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853746.22861: Calling all_plugins_play to load vars for managed_node2 30583 1726853746.22865: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853746.22868: Calling groups_plugins_play to load vars for managed_node2 30583 1726853746.25356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853746.28908: done with get_vars() 30583 1726853746.29001: done getting variables 30583 1726853746.29207: done queuing things up, now waiting for results queue to drain 30583 1726853746.29210: results queue empty 30583 1726853746.29210: checking for any_errors_fatal 30583 1726853746.29214: done checking for 
any_errors_fatal 30583 1726853746.29214: checking for max_fail_percentage 30583 1726853746.29215: done checking for max_fail_percentage 30583 1726853746.29216: checking to see if all hosts have failed and the running result is not ok 30583 1726853746.29217: done checking to see if all hosts have failed 30583 1726853746.29217: getting the remaining hosts for this loop 30583 1726853746.29219: done getting the remaining hosts for this loop 30583 1726853746.29221: getting the next task for host managed_node2 30583 1726853746.29226: done getting next task for host managed_node2 30583 1726853746.29228: ^ task is: TASK: Show result 30583 1726853746.29231: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
30583 1726853746.29233: getting variables
30583 1726853746.29234: in VariableManager get_vars()
30583 1726853746.29361: Calling all_inventory to load vars for managed_node2
30583 1726853746.29364: Calling groups_inventory to load vars for managed_node2
30583 1726853746.29367: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853746.29374: Calling all_plugins_play to load vars for managed_node2
30583 1726853746.29376: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853746.29379: Calling groups_plugins_play to load vars for managed_node2
30583 1726853746.31831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853746.35211: done with get_vars()
30583 1726853746.35245: done getting variables
30583 1726853746.35296: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Show result] *************************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14
Friday 20 September 2024 13:35:46 -0400 (0:00:00.515) 0:01:21.690 ******
30583 1726853746.35330: entering _queue_task() for managed_node2/debug
30583 1726853746.36115: worker is 1 (out of 1 available)
30583 1726853746.36131: exiting _queue_task() for managed_node2/debug
30583 1726853746.36144: done queuing things up, now waiting for results queue to drain
30583 1726853746.36146: waiting for pending results...
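The "Show result" banner above, the `debug` action module loaded for it, and the `__network_connections_result` fact referenced throughout this run suggest the shape of the test tasks being exercised. A hedged reconstruction follows; the actual task files are not part of this log, so task names, module FQCNs, and layout are assumptions, while the connection settings mirror the `module_args` printed for `__network_connections_result` later in the trace:

```yaml
# Hedged sketch only -- reconstructed from the logged task banners and
# module_args, not copied from the real create_bridge_profile.yml.
- name: Include network role
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.network
  vars:
    network_connections:
      - name: statebr          # connection name seen in the logged result
        type: bridge
        persistent_state: present
        ip:
          dhcp4: false
          auto6: false

- name: Show result            # logged task path: .../create_bridge_profile.yml:14
  ansible.builtin.debug:
    var: __network_connections_result
```

The role reports its outcome in the `__network_connections_result` fact, which is why a bare `debug: var=...` task is enough to surface the `stderr` lines NetworkManager produced while adding the profile.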
30583 1726853746.36985: running TaskExecutor() for managed_node2/TASK: Show result 30583 1726853746.36991: in run() - task 02083763-bbaf-05ea-abc5-0000000017d1 30583 1726853746.37182: variable 'ansible_search_path' from source: unknown 30583 1726853746.37186: variable 'ansible_search_path' from source: unknown 30583 1726853746.37188: calling self._execute() 30583 1726853746.37339: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853746.37385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853746.37409: variable 'omit' from source: magic vars 30583 1726853746.38377: variable 'ansible_distribution_major_version' from source: facts 30583 1726853746.38380: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853746.38384: variable 'omit' from source: magic vars 30583 1726853746.38387: variable 'omit' from source: magic vars 30583 1726853746.38503: variable 'omit' from source: magic vars 30583 1726853746.38724: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853746.38728: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853746.38731: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853746.38879: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853746.38886: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853746.38909: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853746.38919: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853746.38926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853746.39188: Set 
connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853746.39206: Set connection var ansible_timeout to 10 30583 1726853746.39215: Set connection var ansible_connection to ssh 30583 1726853746.39278: Set connection var ansible_shell_executable to /bin/sh 30583 1726853746.39285: Set connection var ansible_shell_type to sh 30583 1726853746.39300: Set connection var ansible_pipelining to False 30583 1726853746.39334: variable 'ansible_shell_executable' from source: unknown 30583 1726853746.39384: variable 'ansible_connection' from source: unknown 30583 1726853746.39392: variable 'ansible_module_compression' from source: unknown 30583 1726853746.39399: variable 'ansible_shell_type' from source: unknown 30583 1726853746.39405: variable 'ansible_shell_executable' from source: unknown 30583 1726853746.39410: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853746.39640: variable 'ansible_pipelining' from source: unknown 30583 1726853746.39647: variable 'ansible_timeout' from source: unknown 30583 1726853746.39650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853746.39866: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853746.39887: variable 'omit' from source: magic vars 30583 1726853746.39896: starting attempt loop 30583 1726853746.39907: running the handler 30583 1726853746.39962: variable '__network_connections_result' from source: set_fact 30583 1726853746.40232: variable '__network_connections_result' from source: set_fact 30583 1726853746.40464: handler run complete 30583 1726853746.40499: attempt loop complete, returning result 30583 1726853746.40621: _execute() done 30583 1726853746.40624: dumping result to json 30583 
1726853746.40626: done dumping result, returning
30583 1726853746.40629: done running TaskExecutor() for managed_node2/TASK: Show result [02083763-bbaf-05ea-abc5-0000000017d1]
30583 1726853746.40630: sending task result for task 02083763-bbaf-05ea-abc5-0000000017d1
ok: [managed_node2] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "ip": {
                            "auto6": false,
                            "dhcp4": false
                        },
                        "name": "statebr",
                        "persistent_state": "present",
                        "type": "bridge"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 5c3c483d-e950-47f9-9afb-d5e74f691954\n",
        "stderr_lines": [
            "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 5c3c483d-e950-47f9-9afb-d5e74f691954"
        ]
    }
}
30583 1726853746.40894: no more pending results, returning what we have
30583 1726853746.40897: results queue empty
30583 1726853746.40899: checking for any_errors_fatal
30583 1726853746.40900: done checking for any_errors_fatal
30583 1726853746.40901: checking for max_fail_percentage
30583 1726853746.40903: done checking for max_fail_percentage
30583 1726853746.40904: checking to see if all hosts have failed and the running result is not ok
30583 1726853746.40904: done checking to see if all hosts have failed
30583 1726853746.40905: getting the remaining hosts for this loop
30583 1726853746.40907: done getting the remaining hosts for this loop
30583 1726853746.40910: getting the next task for host managed_node2
30583 1726853746.40921: done getting next task for host managed_node2
30583 1726853746.40924: ^ task is: TASK: Include network role
30583 1726853746.40927: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853746.40931: getting variables
30583 1726853746.40933: in VariableManager get_vars()
30583 1726853746.40967: Calling all_inventory to load vars for managed_node2
30583 1726853746.40969: Calling groups_inventory to load vars for managed_node2
30583 1726853746.40974: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853746.40981: done sending task result for task 02083763-bbaf-05ea-abc5-0000000017d1
30583 1726853746.40984: WORKER PROCESS EXITING
30583 1726853746.41181: Calling all_plugins_play to load vars for managed_node2
30583 1726853746.41184: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853746.41187: Calling groups_plugins_play to load vars for managed_node2
30583 1726853746.44056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853746.47188: done with get_vars()
30583 1726853746.47216: done getting variables

TASK [Include network role] ****************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3
Friday 20 September 2024 13:35:46 -0400 (0:00:00.119) 0:01:21.810 ******
30583 1726853746.47315: entering _queue_task()
for managed_node2/include_role 30583 1726853746.48069: worker is 1 (out of 1 available) 30583 1726853746.48084: exiting _queue_task() for managed_node2/include_role 30583 1726853746.48099: done queuing things up, now waiting for results queue to drain 30583 1726853746.48100: waiting for pending results... 30583 1726853746.48615: running TaskExecutor() for managed_node2/TASK: Include network role 30583 1726853746.49179: in run() - task 02083763-bbaf-05ea-abc5-0000000017d5 30583 1726853746.49183: variable 'ansible_search_path' from source: unknown 30583 1726853746.49185: variable 'ansible_search_path' from source: unknown 30583 1726853746.49188: calling self._execute() 30583 1726853746.49497: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853746.49501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853746.49505: variable 'omit' from source: magic vars 30583 1726853746.50139: variable 'ansible_distribution_major_version' from source: facts 30583 1726853746.50377: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853746.50382: _execute() done 30583 1726853746.50384: dumping result to json 30583 1726853746.50387: done dumping result, returning 30583 1726853746.50390: done running TaskExecutor() for managed_node2/TASK: Include network role [02083763-bbaf-05ea-abc5-0000000017d5] 30583 1726853746.50393: sending task result for task 02083763-bbaf-05ea-abc5-0000000017d5 30583 1726853746.50620: no more pending results, returning what we have 30583 1726853746.50625: in VariableManager get_vars() 30583 1726853746.50666: Calling all_inventory to load vars for managed_node2 30583 1726853746.50669: Calling groups_inventory to load vars for managed_node2 30583 1726853746.50674: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853746.50687: Calling all_plugins_play to load vars for managed_node2 30583 1726853746.50690: Calling groups_plugins_inventory to load 
vars for managed_node2 30583 1726853746.50694: Calling groups_plugins_play to load vars for managed_node2 30583 1726853746.51384: done sending task result for task 02083763-bbaf-05ea-abc5-0000000017d5 30583 1726853746.51388: WORKER PROCESS EXITING 30583 1726853746.53516: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853746.56092: done with get_vars() 30583 1726853746.56114: variable 'ansible_search_path' from source: unknown 30583 1726853746.56115: variable 'ansible_search_path' from source: unknown 30583 1726853746.56255: variable 'omit' from source: magic vars 30583 1726853746.56296: variable 'omit' from source: magic vars 30583 1726853746.56311: variable 'omit' from source: magic vars 30583 1726853746.56314: we have included files to process 30583 1726853746.56315: generating all_blocks data 30583 1726853746.56317: done generating all_blocks data 30583 1726853746.56322: processing included file: fedora.linux_system_roles.network 30583 1726853746.56342: in VariableManager get_vars() 30583 1726853746.56356: done with get_vars() 30583 1726853746.56384: in VariableManager get_vars() 30583 1726853746.56402: done with get_vars() 30583 1726853746.56438: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30583 1726853746.56611: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30583 1726853746.56809: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30583 1726853746.57252: in VariableManager get_vars() 30583 1726853746.57274: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30583 1726853746.59514: iterating over new_blocks loaded from include file 30583 1726853746.59516: in VariableManager get_vars() 30583 1726853746.59535: done with get_vars() 30583 1726853746.59537: 
filtering new block on tags 30583 1726853746.59806: done filtering new block on tags 30583 1726853746.59809: in VariableManager get_vars() 30583 1726853746.59824: done with get_vars() 30583 1726853746.59826: filtering new block on tags 30583 1726853746.59841: done filtering new block on tags 30583 1726853746.59843: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30583 1726853746.59848: extending task lists for all hosts with included blocks 30583 1726853746.59955: done extending task lists 30583 1726853746.59957: done processing included files 30583 1726853746.59958: results queue empty 30583 1726853746.59958: checking for any_errors_fatal 30583 1726853746.59963: done checking for any_errors_fatal 30583 1726853746.59964: checking for max_fail_percentage 30583 1726853746.59965: done checking for max_fail_percentage 30583 1726853746.59966: checking to see if all hosts have failed and the running result is not ok 30583 1726853746.59966: done checking to see if all hosts have failed 30583 1726853746.59967: getting the remaining hosts for this loop 30583 1726853746.59968: done getting the remaining hosts for this loop 30583 1726853746.59972: getting the next task for host managed_node2 30583 1726853746.59977: done getting next task for host managed_node2 30583 1726853746.59980: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30583 1726853746.59984: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853746.59994: getting variables
30583 1726853746.59995: in VariableManager get_vars()
30583 1726853746.60008: Calling all_inventory to load vars for managed_node2
30583 1726853746.60011: Calling groups_inventory to load vars for managed_node2
30583 1726853746.60013: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853746.60017: Calling all_plugins_play to load vars for managed_node2
30583 1726853746.60019: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853746.60022: Calling groups_plugins_play to load vars for managed_node2
30583 1726853746.61323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853746.62824: done with get_vars()
30583 1726853746.62846: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024 13:35:46 -0400 (0:00:00.156) 0:01:21.966 ******
30583 1726853746.62923: entering _queue_task() for managed_node2/include_tasks
30583 1726853746.63292: worker is 1 (out of 1 available)
30583
1726853746.63304: exiting _queue_task() for managed_node2/include_tasks 30583 1726853746.63317: done queuing things up, now waiting for results queue to drain 30583 1726853746.63318: waiting for pending results... 30583 1726853746.63622: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30583 1726853746.63831: in run() - task 02083763-bbaf-05ea-abc5-0000000019bf 30583 1726853746.63853: variable 'ansible_search_path' from source: unknown 30583 1726853746.63862: variable 'ansible_search_path' from source: unknown 30583 1726853746.63908: calling self._execute() 30583 1726853746.64018: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853746.64030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853746.64047: variable 'omit' from source: magic vars 30583 1726853746.64462: variable 'ansible_distribution_major_version' from source: facts 30583 1726853746.64483: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853746.64493: _execute() done 30583 1726853746.64501: dumping result to json 30583 1726853746.64508: done dumping result, returning 30583 1726853746.64519: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-05ea-abc5-0000000019bf] 30583 1726853746.64528: sending task result for task 02083763-bbaf-05ea-abc5-0000000019bf 30583 1726853746.64645: done sending task result for task 02083763-bbaf-05ea-abc5-0000000019bf 30583 1726853746.64712: no more pending results, returning what we have 30583 1726853746.64738: in VariableManager get_vars() 30583 1726853746.64790: Calling all_inventory to load vars for managed_node2 30583 1726853746.64793: Calling groups_inventory to load vars for managed_node2 30583 1726853746.64795: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853746.64807: Calling 
all_plugins_play to load vars for managed_node2 30583 1726853746.64810: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853746.64814: Calling groups_plugins_play to load vars for managed_node2 30583 1726853746.65877: WORKER PROCESS EXITING 30583 1726853746.67303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853746.68946: done with get_vars() 30583 1726853746.68968: variable 'ansible_search_path' from source: unknown 30583 1726853746.68969: variable 'ansible_search_path' from source: unknown 30583 1726853746.69201: we have included files to process 30583 1726853746.69203: generating all_blocks data 30583 1726853746.69204: done generating all_blocks data 30583 1726853746.69208: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853746.69216: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853746.69220: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853746.70317: done processing included file 30583 1726853746.70319: iterating over new_blocks loaded from include file 30583 1726853746.70321: in VariableManager get_vars() 30583 1726853746.70348: done with get_vars() 30583 1726853746.70350: filtering new block on tags 30583 1726853746.70487: done filtering new block on tags 30583 1726853746.70491: in VariableManager get_vars() 30583 1726853746.70514: done with get_vars() 30583 1726853746.70516: filtering new block on tags 30583 1726853746.70561: done filtering new block on tags 30583 1726853746.70564: in VariableManager get_vars() 30583 1726853746.70797: done with get_vars() 30583 1726853746.70799: filtering new block on tags 30583 1726853746.70842: done filtering new block on tags 30583 1726853746.70845: done iterating over 
new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30583 1726853746.70851: extending task lists for all hosts with included blocks 30583 1726853746.73441: done extending task lists 30583 1726853746.73442: done processing included files 30583 1726853746.73443: results queue empty 30583 1726853746.73444: checking for any_errors_fatal 30583 1726853746.73448: done checking for any_errors_fatal 30583 1726853746.73448: checking for max_fail_percentage 30583 1726853746.73450: done checking for max_fail_percentage 30583 1726853746.73451: checking to see if all hosts have failed and the running result is not ok 30583 1726853746.73451: done checking to see if all hosts have failed 30583 1726853746.73452: getting the remaining hosts for this loop 30583 1726853746.73453: done getting the remaining hosts for this loop 30583 1726853746.73456: getting the next task for host managed_node2 30583 1726853746.73461: done getting next task for host managed_node2 30583 1726853746.73464: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30583 1726853746.73468: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853746.73482: getting variables
30583 1726853746.73483: in VariableManager get_vars()
30583 1726853746.73499: Calling all_inventory to load vars for managed_node2
30583 1726853746.73502: Calling groups_inventory to load vars for managed_node2
30583 1726853746.73504: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853746.73510: Calling all_plugins_play to load vars for managed_node2
30583 1726853746.73512: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853746.73515: Calling groups_plugins_play to load vars for managed_node2
30583 1726853746.74697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853746.77408: done with get_vars()
30583 1726853746.77439: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Friday 20 September 2024 13:35:46 -0400 (0:00:00.145) 0:01:22.112 ******
30583 1726853746.77525: entering _queue_task() for managed_node2/setup
30583 1726853746.77907: worker is 1 (out of 1 available)
30583 1726853746.77921: exiting _queue_task() for managed_node2/setup
30583
1726853746.77935: done queuing things up, now waiting for results queue to drain 30583 1726853746.77936: waiting for pending results... 30583 1726853746.78247: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30583 1726853746.78430: in run() - task 02083763-bbaf-05ea-abc5-000000001a16 30583 1726853746.78452: variable 'ansible_search_path' from source: unknown 30583 1726853746.78461: variable 'ansible_search_path' from source: unknown 30583 1726853746.78509: calling self._execute() 30583 1726853746.78608: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853746.78623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853746.78637: variable 'omit' from source: magic vars 30583 1726853746.79226: variable 'ansible_distribution_major_version' from source: facts 30583 1726853746.79400: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853746.80022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853746.83842: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853746.83938: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853746.84011: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853746.84065: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853746.84101: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853746.84221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 30583 1726853746.84257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853746.84296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853746.84363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853746.84595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853746.84599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853746.84601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853746.84603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853746.84605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853746.84607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853746.84782: variable '__network_required_facts' from source: role '' defaults 30583 1726853746.84795: variable 'ansible_facts' from source: unknown 30583 1726853746.85563: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30583 1726853746.85575: when evaluation is False, skipping this task 30583 1726853746.85582: _execute() done 30583 1726853746.85593: dumping result to json 30583 1726853746.85600: done dumping result, returning 30583 1726853746.85612: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-05ea-abc5-000000001a16] 30583 1726853746.85622: sending task result for task 02083763-bbaf-05ea-abc5-000000001a16 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853746.85908: no more pending results, returning what we have 30583 1726853746.85913: results queue empty 30583 1726853746.85914: checking for any_errors_fatal 30583 1726853746.85915: done checking for any_errors_fatal 30583 1726853746.85916: checking for max_fail_percentage 30583 1726853746.85919: done checking for max_fail_percentage 30583 1726853746.85920: checking to see if all hosts have failed and the running result is not ok 30583 1726853746.85920: done checking to see if all hosts have failed 30583 1726853746.85921: getting the remaining hosts for this loop 30583 1726853746.85924: done getting the remaining hosts for this loop 30583 1726853746.85928: getting the next task for host managed_node2 30583 1726853746.85940: done getting next task for host managed_node2 30583 1726853746.85944: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30583 1726853746.85950: ^ state is: 
HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853746.85977: getting variables 30583 1726853746.85979: in VariableManager get_vars() 30583 1726853746.86025: Calling all_inventory to load vars for managed_node2 30583 1726853746.86028: Calling groups_inventory to load vars for managed_node2 30583 1726853746.86030: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853746.86041: Calling all_plugins_play to load vars for managed_node2 30583 1726853746.86044: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853746.86047: Calling groups_plugins_play to load vars for managed_node2 30583 1726853746.86584: done sending task result for task 02083763-bbaf-05ea-abc5-000000001a16 30583 1726853746.86593: WORKER PROCESS EXITING 30583 1726853746.88239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853746.91675: done with get_vars() 30583 1726853746.91706: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:35:46 -0400 (0:00:00.146) 0:01:22.258 ****** 30583 1726853746.92137: entering _queue_task() for managed_node2/stat 30583 1726853746.92908: worker is 1 (out of 1 available) 30583 1726853746.92919: exiting _queue_task() for managed_node2/stat 30583 1726853746.92930: done queuing things up, now waiting for results queue to drain 30583 1726853746.92931: waiting for pending results... 
30583 1726853746.93230: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30583 1726853746.93418: in run() - task 02083763-bbaf-05ea-abc5-000000001a18 30583 1726853746.93439: variable 'ansible_search_path' from source: unknown 30583 1726853746.93447: variable 'ansible_search_path' from source: unknown 30583 1726853746.93495: calling self._execute() 30583 1726853746.93667: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853746.93681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853746.93737: variable 'omit' from source: magic vars 30583 1726853746.94418: variable 'ansible_distribution_major_version' from source: facts 30583 1726853746.94463: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853746.94898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853746.95436: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853746.95513: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853746.95614: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853746.95665: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853746.95753: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853746.95883: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853746.95887: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853746.95889: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853746.96008: variable '__network_is_ostree' from source: set_fact 30583 1726853746.96048: Evaluated conditional (not __network_is_ostree is defined): False 30583 1726853746.96063: when evaluation is False, skipping this task 30583 1726853746.96095: _execute() done 30583 1726853746.96110: dumping result to json 30583 1726853746.96130: done dumping result, returning 30583 1726853746.96227: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-05ea-abc5-000000001a18] 30583 1726853746.96236: sending task result for task 02083763-bbaf-05ea-abc5-000000001a18 30583 1726853746.96322: done sending task result for task 02083763-bbaf-05ea-abc5-000000001a18 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30583 1726853746.96546: no more pending results, returning what we have 30583 1726853746.96550: results queue empty 30583 1726853746.96551: checking for any_errors_fatal 30583 1726853746.96562: done checking for any_errors_fatal 30583 1726853746.96563: checking for max_fail_percentage 30583 1726853746.96568: done checking for max_fail_percentage 30583 1726853746.96569: checking to see if all hosts have failed and the running result is not ok 30583 1726853746.96570: done checking to see if all hosts have failed 30583 1726853746.96572: getting the remaining hosts for this loop 30583 1726853746.96577: done getting the remaining hosts for this loop 30583 1726853746.96585: getting the next task for host 
managed_node2 30583 1726853746.96595: done getting next task for host managed_node2 30583 1726853746.96602: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30583 1726853746.96611: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853746.96644: getting variables 30583 1726853746.96646: in VariableManager get_vars() 30583 1726853746.96997: Calling all_inventory to load vars for managed_node2 30583 1726853746.97000: Calling groups_inventory to load vars for managed_node2 30583 1726853746.97003: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853746.97011: WORKER PROCESS EXITING 30583 1726853746.97022: Calling all_plugins_play to load vars for managed_node2 30583 1726853746.97026: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853746.97029: Calling groups_plugins_play to load vars for managed_node2 30583 1726853747.08597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853747.10149: done with get_vars() 30583 1726853747.10177: done getting variables 30583 1726853747.10224: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:35:47 -0400 (0:00:00.181) 0:01:22.439 ****** 30583 1726853747.10259: entering _queue_task() for managed_node2/set_fact 30583 1726853747.10625: worker is 1 (out of 1 available) 30583 1726853747.10637: exiting _queue_task() for managed_node2/set_fact 30583 1726853747.10649: done queuing things up, now waiting for results queue to drain 30583 1726853747.10651: waiting for pending results... 
30583 1726853747.10961: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30583 1726853747.11161: in run() - task 02083763-bbaf-05ea-abc5-000000001a19 30583 1726853747.11185: variable 'ansible_search_path' from source: unknown 30583 1726853747.11193: variable 'ansible_search_path' from source: unknown 30583 1726853747.11240: calling self._execute() 30583 1726853747.11348: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853747.11362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853747.11477: variable 'omit' from source: magic vars 30583 1726853747.11768: variable 'ansible_distribution_major_version' from source: facts 30583 1726853747.11789: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853747.11954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853747.12577: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853747.12580: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853747.12736: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853747.12779: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853747.13047: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853747.13137: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853747.13141: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853747.13144: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853747.13335: variable '__network_is_ostree' from source: set_fact 30583 1726853747.13363: Evaluated conditional (not __network_is_ostree is defined): False 30583 1726853747.13572: when evaluation is False, skipping this task 30583 1726853747.13576: _execute() done 30583 1726853747.13579: dumping result to json 30583 1726853747.13581: done dumping result, returning 30583 1726853747.13584: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-05ea-abc5-000000001a19] 30583 1726853747.13586: sending task result for task 02083763-bbaf-05ea-abc5-000000001a19 30583 1726853747.13657: done sending task result for task 02083763-bbaf-05ea-abc5-000000001a19 30583 1726853747.13660: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30583 1726853747.13722: no more pending results, returning what we have 30583 1726853747.13726: results queue empty 30583 1726853747.13727: checking for any_errors_fatal 30583 1726853747.13737: done checking for any_errors_fatal 30583 1726853747.13738: checking for max_fail_percentage 30583 1726853747.13740: done checking for max_fail_percentage 30583 1726853747.13742: checking to see if all hosts have failed and the running result is not ok 30583 1726853747.13743: done checking to see if all hosts have failed 30583 1726853747.13743: getting the remaining hosts for this loop 30583 1726853747.13745: done getting the remaining hosts for this loop 
30583 1726853747.13749: getting the next task for host managed_node2 30583 1726853747.13762: done getting next task for host managed_node2 30583 1726853747.13766: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853747.13775: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853747.13803: getting variables 30583 1726853747.13805: in VariableManager get_vars() 30583 1726853747.13852: Calling all_inventory to load vars for managed_node2 30583 1726853747.13855: Calling groups_inventory to load vars for managed_node2 30583 1726853747.13858: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853747.13870: Calling all_plugins_play to load vars for managed_node2 30583 1726853747.14078: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853747.14082: Calling groups_plugins_play to load vars for managed_node2 30583 1726853747.16759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853747.20126: done with get_vars() 30583 1726853747.20155: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:35:47 -0400 (0:00:00.101) 0:01:22.541 ****** 30583 1726853747.20459: entering _queue_task() for managed_node2/service_facts 30583 1726853747.21229: worker is 1 (out of 1 available) 30583 1726853747.21244: exiting _queue_task() for managed_node2/service_facts 30583 1726853747.21258: done queuing things up, now waiting for results queue to drain 30583 1726853747.21259: waiting for pending results... 
30583 1726853747.21895: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853747.22303: in run() - task 02083763-bbaf-05ea-abc5-000000001a1b 30583 1726853747.22317: variable 'ansible_search_path' from source: unknown 30583 1726853747.22321: variable 'ansible_search_path' from source: unknown 30583 1726853747.22478: calling self._execute() 30583 1726853747.22698: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853747.22702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853747.22714: variable 'omit' from source: magic vars 30583 1726853747.23368: variable 'ansible_distribution_major_version' from source: facts 30583 1726853747.23381: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853747.23388: variable 'omit' from source: magic vars 30583 1726853747.23468: variable 'omit' from source: magic vars 30583 1726853747.23502: variable 'omit' from source: magic vars 30583 1726853747.23546: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853747.23586: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853747.23605: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853747.23624: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853747.23690: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853747.23694: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853747.23697: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853747.23700: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853747.23844: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853747.23850: Set connection var ansible_timeout to 10 30583 1726853747.23853: Set connection var ansible_connection to ssh 30583 1726853747.23856: Set connection var ansible_shell_executable to /bin/sh 30583 1726853747.23858: Set connection var ansible_shell_type to sh 30583 1726853747.23860: Set connection var ansible_pipelining to False 30583 1726853747.23862: variable 'ansible_shell_executable' from source: unknown 30583 1726853747.23865: variable 'ansible_connection' from source: unknown 30583 1726853747.23867: variable 'ansible_module_compression' from source: unknown 30583 1726853747.23870: variable 'ansible_shell_type' from source: unknown 30583 1726853747.23874: variable 'ansible_shell_executable' from source: unknown 30583 1726853747.23877: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853747.23879: variable 'ansible_pipelining' from source: unknown 30583 1726853747.23882: variable 'ansible_timeout' from source: unknown 30583 1726853747.23884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853747.24060: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853747.24069: variable 'omit' from source: magic vars 30583 1726853747.24078: starting attempt loop 30583 1726853747.24082: running the handler 30583 1726853747.24170: _low_level_execute_command(): starting 30583 1726853747.24175: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853747.24831: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853747.24981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853747.25057: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853747.26840: stdout chunk (state=3): >>>/root <<< 30583 1726853747.26984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853747.27004: stdout chunk (state=3): >>><<< 30583 1726853747.27029: stderr chunk (state=3): >>><<< 30583 1726853747.27060: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853747.27175: _low_level_execute_command(): starting 30583 1726853747.27181: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853747.270677-34400-68051520371313 `" && echo ansible-tmp-1726853747.270677-34400-68051520371313="` echo /root/.ansible/tmp/ansible-tmp-1726853747.270677-34400-68051520371313 `" ) && sleep 0' 30583 1726853747.27678: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853747.27703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853747.27776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853747.27779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853747.27782: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853747.27785: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853747.27788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853747.27810: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853747.27814: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853747.27816: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853747.27819: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853747.27821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853747.27823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853747.27825: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853747.27827: stderr chunk (state=3): >>>debug2: match found <<< 30583 1726853747.27893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853747.27896: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853747.27997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853747.30108: stdout chunk (state=3): >>>ansible-tmp-1726853747.270677-34400-68051520371313=/root/.ansible/tmp/ansible-tmp-1726853747.270677-34400-68051520371313 <<< 30583 1726853747.30163: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853747.30257: stderr chunk (state=3): >>><<< 30583 1726853747.30476: stdout chunk (state=3): >>><<< 30583 1726853747.30480: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853747.270677-34400-68051520371313=/root/.ansible/tmp/ansible-tmp-1726853747.270677-34400-68051520371313 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853747.30483: variable 'ansible_module_compression' from source: unknown 30583 1726853747.30598: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30583 1726853747.30645: variable 'ansible_facts' from source: unknown 30583 1726853747.31079: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853747.270677-34400-68051520371313/AnsiballZ_service_facts.py 30583 1726853747.31423: Sending initial data 30583 1726853747.31430: Sent initial data (160 bytes) 30583 1726853747.32860: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853747.32866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
30583 1726853747.33091: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853747.33155: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853747.33167: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853747.33290: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853747.35074: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853747.35118: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853747.35261: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpw1lw7lm4 /root/.ansible/tmp/ansible-tmp-1726853747.270677-34400-68051520371313/AnsiballZ_service_facts.py <<< 30583 1726853747.35265: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853747.270677-34400-68051520371313/AnsiballZ_service_facts.py" <<< 30583 1726853747.35348: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpw1lw7lm4" to remote "/root/.ansible/tmp/ansible-tmp-1726853747.270677-34400-68051520371313/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853747.270677-34400-68051520371313/AnsiballZ_service_facts.py" <<< 30583 1726853747.37109: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853747.37192: stderr chunk (state=3): >>><<< 30583 1726853747.37196: stdout chunk (state=3): >>><<< 30583 1726853747.37214: done transferring module to remote 30583 1726853747.37369: _low_level_execute_command(): starting 30583 1726853747.37382: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853747.270677-34400-68051520371313/ /root/.ansible/tmp/ansible-tmp-1726853747.270677-34400-68051520371313/AnsiballZ_service_facts.py && sleep 0' 30583 1726853747.38499: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853747.38513: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853747.38530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853747.38594: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853747.38601: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853747.38685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853747.40579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853747.40653: stderr chunk (state=3): >>><<< 30583 1726853747.40656: stdout chunk (state=3): >>><<< 30583 1726853747.40877: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853747.40880: _low_level_execute_command(): starting 30583 1726853747.40883: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853747.270677-34400-68051520371313/AnsiballZ_service_facts.py && sleep 0' 30583 1726853747.42042: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853747.42057: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853747.42072: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853747.42191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853749.06362: stdout 
chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": 
"dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": 
"irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 30583 1726853749.06430: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": 
"rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": 
"systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, 
"systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 30583 1726853749.06476: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": 
"sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": 
"systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30583 1726853749.08036: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853749.08063: stderr chunk (state=3): >>><<< 30583 1726853749.08067: stdout chunk (state=3): >>><<< 30583 1726853749.08091: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": 
"sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": 
{"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": 
"blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": 
"unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": 
"sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": 
"systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": 
"systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
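The JSON above is the raw stdout of the `service_facts` module run; the final task result is censored in the playbook output because the task ran with `no_log: true`. As an illustration only (this code is not part of the run), a short Python sketch of how a consumer might group the `ansible_facts.services` mapping by reported state:

```python
import json

# Illustrative sample shaped like the service_facts return above
# (two entries only; the real run lists hundreds of units).
payload = json.loads("""
{"ansible_facts": {"services": {
  "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive",
                       "status": "indirect", "source": "systemd"},
  "systemd-localed.service": {"name": "systemd-localed.service",
                              "state": "inactive", "status": "static",
                              "source": "systemd"}}}}
""")

# Group unit names by reported state, as a consumer of the fact
# data might do before deciding which services need handling.
by_state = {}
for name, info in payload["ansible_facts"]["services"].items():
    by_state.setdefault(info["state"], []).append(name)

print(by_state)
```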
30583 1726853749.08823: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853747.270677-34400-68051520371313/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853749.08827: _low_level_execute_command(): starting 30583 1726853749.08846: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853747.270677-34400-68051520371313/ > /dev/null 2>&1 && sleep 0' 30583 1726853749.09576: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853749.09637: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853749.09744: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853749.09957: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853749.11979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853749.11983: stdout chunk (state=3): >>><<< 30583 1726853749.11985: stderr chunk (state=3): >>><<< 30583 1726853749.11988: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853749.11991: handler run complete 30583 1726853749.12264: variable 'ansible_facts' from source: unknown 30583 1726853749.12615: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853749.13648: variable 'ansible_facts' from source: unknown 30583 1726853749.13991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853749.14224: attempt loop complete, returning result 30583 1726853749.14236: _execute() done 30583 1726853749.14247: dumping result to json 30583 1726853749.14320: done dumping result, returning 30583 1726853749.14333: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-05ea-abc5-000000001a1b] 30583 1726853749.14343: sending task result for task 02083763-bbaf-05ea-abc5-000000001a1b 30583 1726853749.15848: done sending task result for task 02083763-bbaf-05ea-abc5-000000001a1b 30583 1726853749.15851: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853749.15962: no more pending results, returning what we have 30583 1726853749.15965: results queue empty 30583 1726853749.15966: checking for any_errors_fatal 30583 1726853749.15969: done checking for any_errors_fatal 30583 1726853749.15970: checking for max_fail_percentage 30583 1726853749.15974: done checking for max_fail_percentage 30583 1726853749.15974: checking to see if all hosts have failed and the running result is not ok 30583 1726853749.15975: done checking to see if all hosts have failed 30583 1726853749.15976: getting the remaining hosts for this loop 30583 1726853749.15977: done getting the remaining hosts for this loop 30583 1726853749.15981: getting the next task for host managed_node2 30583 1726853749.15988: done getting next task for host managed_node2 30583 1726853749.15991: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 
1726853749.16000: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853749.16021: getting variables 30583 1726853749.16023: in VariableManager get_vars() 30583 1726853749.16053: Calling all_inventory to load vars for managed_node2 30583 1726853749.16055: Calling groups_inventory to load vars for managed_node2 30583 1726853749.16058: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853749.16067: Calling all_plugins_play to load vars for managed_node2 30583 1726853749.16070: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853749.16076: Calling groups_plugins_play to load vars for managed_node2 30583 1726853749.17821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853749.20218: done with get_vars() 30583 1726853749.20257: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:35:49 -0400 (0:00:01.999) 0:01:24.540 ****** 30583 1726853749.20374: entering _queue_task() for managed_node2/package_facts 30583 1726853749.20768: worker is 1 (out of 1 available) 30583 1726853749.20787: exiting _queue_task() for managed_node2/package_facts 30583 1726853749.20804: done queuing things up, now waiting for results queue to drain 30583 1726853749.20806: waiting for pending results... 
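Per the task banner above, the next task comes from `roles/network/tasks/set_facts.yml:26` and queues `package_facts` for managed_node2. A hedged sketch of what such a task typically looks like (the collection's actual file may differ; the `when:` condition mirrors the `ansible_distribution_major_version != '6'` conditional this log shows being evaluated for the task):

```yaml
# Illustrative sketch only -- not copied from the collection's task file.
- name: Check which packages are installed
  ansible.builtin.package_facts:
  when: ansible_distribution_major_version != '6'
```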
30583 1726853749.21146: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 1726853749.21291: in run() - task 02083763-bbaf-05ea-abc5-000000001a1c 30583 1726853749.21318: variable 'ansible_search_path' from source: unknown 30583 1726853749.21327: variable 'ansible_search_path' from source: unknown 30583 1726853749.21379: calling self._execute() 30583 1726853749.22000: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853749.22004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853749.22007: variable 'omit' from source: magic vars 30583 1726853749.22519: variable 'ansible_distribution_major_version' from source: facts 30583 1726853749.22544: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853749.22560: variable 'omit' from source: magic vars 30583 1726853749.22670: variable 'omit' from source: magic vars 30583 1726853749.22711: variable 'omit' from source: magic vars 30583 1726853749.22762: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853749.22799: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853749.22822: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853749.22839: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853749.22857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853749.22899: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853749.22906: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853749.22913: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853749.23018: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853749.23028: Set connection var ansible_timeout to 10 30583 1726853749.23034: Set connection var ansible_connection to ssh 30583 1726853749.23043: Set connection var ansible_shell_executable to /bin/sh 30583 1726853749.23048: Set connection var ansible_shell_type to sh 30583 1726853749.23062: Set connection var ansible_pipelining to False 30583 1726853749.23104: variable 'ansible_shell_executable' from source: unknown 30583 1726853749.23113: variable 'ansible_connection' from source: unknown 30583 1726853749.23120: variable 'ansible_module_compression' from source: unknown 30583 1726853749.23127: variable 'ansible_shell_type' from source: unknown 30583 1726853749.23133: variable 'ansible_shell_executable' from source: unknown 30583 1726853749.23139: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853749.23146: variable 'ansible_pipelining' from source: unknown 30583 1726853749.23152: variable 'ansible_timeout' from source: unknown 30583 1726853749.23162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853749.23387: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853749.23412: variable 'omit' from source: magic vars 30583 1726853749.23421: starting attempt loop 30583 1726853749.23507: running the handler 30583 1726853749.23514: _low_level_execute_command(): starting 30583 1726853749.23517: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853749.24293: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853749.24338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853749.24360: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853749.24513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853749.24605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853749.26346: stdout chunk (state=3): >>>/root <<< 30583 1726853749.26452: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853749.26501: stderr chunk (state=3): >>><<< 30583 1726853749.26517: stdout chunk (state=3): >>><<< 30583 1726853749.26565: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853749.26753: _low_level_execute_command(): starting 30583 1726853749.26757: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853749.266627-34480-41457219787137 `" && echo ansible-tmp-1726853749.266627-34480-41457219787137="` echo /root/.ansible/tmp/ansible-tmp-1726853749.266627-34480-41457219787137 `" ) && sleep 0' 30583 1726853749.27780: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853749.27992: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853749.28104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853749.28192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853749.28395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853749.30454: stdout chunk (state=3): >>>ansible-tmp-1726853749.266627-34480-41457219787137=/root/.ansible/tmp/ansible-tmp-1726853749.266627-34480-41457219787137 <<< 30583 1726853749.30589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853749.30695: stderr chunk (state=3): >>><<< 30583 1726853749.30709: stdout chunk (state=3): >>><<< 30583 1726853749.30733: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853749.266627-34480-41457219787137=/root/.ansible/tmp/ansible-tmp-1726853749.266627-34480-41457219787137 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853749.30831: variable 'ansible_module_compression' from source: unknown 30583 1726853749.30979: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30583 1726853749.31097: variable 'ansible_facts' from source: unknown 30583 1726853749.31577: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853749.266627-34480-41457219787137/AnsiballZ_package_facts.py 30583 1726853749.31803: Sending initial data 30583 1726853749.31807: Sent initial data (160 bytes) 30583 1726853749.33116: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853749.33215: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853749.33448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853749.33555: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853749.33562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853749.35252: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30583 1726853749.35260: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 30583 1726853749.35399: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853749.35636: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpqlfaqy2f /root/.ansible/tmp/ansible-tmp-1726853749.266627-34480-41457219787137/AnsiballZ_package_facts.py <<< 30583 1726853749.35639: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853749.266627-34480-41457219787137/AnsiballZ_package_facts.py" <<< 30583 1726853749.35719: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpqlfaqy2f" to remote "/root/.ansible/tmp/ansible-tmp-1726853749.266627-34480-41457219787137/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853749.266627-34480-41457219787137/AnsiballZ_package_facts.py" <<< 30583 1726853749.39267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853749.39272: stdout chunk (state=3): >>><<< 30583 1726853749.39279: stderr chunk (state=3): >>><<< 30583 1726853749.39320: done transferring module to remote 30583 1726853749.39324: _low_level_execute_command(): starting 30583 1726853749.39326: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853749.266627-34480-41457219787137/ /root/.ansible/tmp/ansible-tmp-1726853749.266627-34480-41457219787137/AnsiballZ_package_facts.py && sleep 0' 30583 1726853749.40793: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853749.40945: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853749.40951: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853749.41076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853749.43030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853749.43034: stdout chunk (state=3): >>><<< 30583 1726853749.43055: stderr chunk (state=3): >>><<< 30583 1726853749.43059: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853749.43068: _low_level_execute_command(): starting 30583 1726853749.43073: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853749.266627-34480-41457219787137/AnsiballZ_package_facts.py && sleep 0' 30583 1726853749.44399: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853749.44403: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853749.44406: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853749.44507: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853749.44511: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853749.44513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853749.44515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853749.44518: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853749.44587: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853749.44791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853749.90338: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", 
"version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": 
[{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 30583 1726853749.90481: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": 
"openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", 
"version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": 
"libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 30583 1726853749.90559: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": 
"8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": 
"11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": 
"15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": 
"4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", 
"version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 30583 1726853749.90596: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", 
"version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": 
"511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", 
"release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": 
"python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", 
"version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30583 1726853749.92438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853749.92455: stdout chunk (state=3): >>><<< 30583 1726853749.92477: stderr chunk (state=3): >>><<< 30583 1726853749.92545: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853749.95179: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853749.266627-34480-41457219787137/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853749.95183: _low_level_execute_command(): starting 30583 1726853749.95185: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853749.266627-34480-41457219787137/ > /dev/null 2>&1 && sleep 0' 30583 1726853749.95880: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853749.95930: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853749.95946: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853749.95985: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853749.96109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853749.98104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853749.98107: stdout chunk (state=3): >>><<< 30583 1726853749.98110: stderr chunk (state=3): >>><<< 30583 1726853749.98177: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853749.98181: handler run complete 30583 1726853749.99212: variable 'ansible_facts' from source: unknown 30583 1726853749.99794: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853750.01593: variable 'ansible_facts' from source: unknown 30583 1726853750.02024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853750.02711: attempt loop complete, returning result 30583 1726853750.02721: _execute() done 30583 1726853750.02724: dumping result to json 30583 1726853750.02927: done dumping result, returning 30583 1726853750.02937: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-05ea-abc5-000000001a1c] 30583 1726853750.02940: sending task result for task 02083763-bbaf-05ea-abc5-000000001a1c 30583 1726853750.05334: done sending task result for task 02083763-bbaf-05ea-abc5-000000001a1c 30583 1726853750.05337: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853750.05510: no more pending results, returning what we have 30583 1726853750.05514: results queue empty 30583 1726853750.05515: checking for any_errors_fatal 30583 1726853750.05520: done checking for any_errors_fatal 30583 1726853750.05521: checking for max_fail_percentage 30583 1726853750.05523: done checking for max_fail_percentage 30583 1726853750.05524: checking to see if all hosts have failed and the running result is not ok 30583 1726853750.05525: done checking to see if all hosts have failed 30583 1726853750.05526: getting the remaining hosts for this loop 30583 1726853750.05528: done getting the remaining hosts for this loop 30583 1726853750.05532: getting the next task for host managed_node2 30583 1726853750.05540: done getting next task for host managed_node2 30583 1726853750.05551: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853750.05558: 
^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853750.05573: getting variables 30583 1726853750.05575: in VariableManager get_vars() 30583 1726853750.05611: Calling all_inventory to load vars for managed_node2 30583 1726853750.05614: Calling groups_inventory to load vars for managed_node2 30583 1726853750.05617: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853750.05626: Calling all_plugins_play to load vars for managed_node2 30583 1726853750.05630: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853750.05633: Calling groups_plugins_play to load vars for managed_node2 30583 1726853750.06908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853750.08561: done with get_vars() 30583 1726853750.08597: done getting variables 30583 1726853750.08660: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:35:50 -0400 (0:00:00.883) 0:01:25.424 ****** 30583 1726853750.08712: entering _queue_task() for managed_node2/debug 30583 1726853750.09301: worker is 1 (out of 1 available) 30583 1726853750.09311: exiting _queue_task() for managed_node2/debug 30583 1726853750.09320: done queuing things up, now waiting for results queue to drain 30583 1726853750.09321: waiting for pending results... 
30583 1726853750.09456: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853750.09591: in run() - task 02083763-bbaf-05ea-abc5-0000000019c0 30583 1726853750.09611: variable 'ansible_search_path' from source: unknown 30583 1726853750.09618: variable 'ansible_search_path' from source: unknown 30583 1726853750.09668: calling self._execute() 30583 1726853750.09881: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853750.09885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853750.09887: variable 'omit' from source: magic vars 30583 1726853750.10263: variable 'ansible_distribution_major_version' from source: facts 30583 1726853750.10285: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853750.10299: variable 'omit' from source: magic vars 30583 1726853750.10381: variable 'omit' from source: magic vars 30583 1726853750.10492: variable 'network_provider' from source: set_fact 30583 1726853750.10517: variable 'omit' from source: magic vars 30583 1726853750.10580: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853750.10620: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853750.10658: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853750.10682: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853750.10700: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853750.10747: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853750.10855: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 
1726853750.10858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853750.10885: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853750.10964: Set connection var ansible_timeout to 10 30583 1726853750.10970: Set connection var ansible_connection to ssh 30583 1726853750.10977: Set connection var ansible_shell_executable to /bin/sh 30583 1726853750.10979: Set connection var ansible_shell_type to sh 30583 1726853750.10981: Set connection var ansible_pipelining to False 30583 1726853750.10983: variable 'ansible_shell_executable' from source: unknown 30583 1726853750.10986: variable 'ansible_connection' from source: unknown 30583 1726853750.10989: variable 'ansible_module_compression' from source: unknown 30583 1726853750.10991: variable 'ansible_shell_type' from source: unknown 30583 1726853750.10993: variable 'ansible_shell_executable' from source: unknown 30583 1726853750.10995: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853750.10997: variable 'ansible_pipelining' from source: unknown 30583 1726853750.11002: variable 'ansible_timeout' from source: unknown 30583 1726853750.11011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853750.11159: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853750.11183: variable 'omit' from source: magic vars 30583 1726853750.11198: starting attempt loop 30583 1726853750.11205: running the handler 30583 1726853750.11275: handler run complete 30583 1726853750.11279: attempt loop complete, returning result 30583 1726853750.11289: _execute() done 30583 1726853750.11292: dumping result to json 30583 1726853750.11302: done dumping result, returning 
30583 1726853750.11398: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-05ea-abc5-0000000019c0] 30583 1726853750.11401: sending task result for task 02083763-bbaf-05ea-abc5-0000000019c0 30583 1726853750.11484: done sending task result for task 02083763-bbaf-05ea-abc5-0000000019c0 30583 1726853750.11488: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 30583 1726853750.11593: no more pending results, returning what we have 30583 1726853750.11596: results queue empty 30583 1726853750.11597: checking for any_errors_fatal 30583 1726853750.11614: done checking for any_errors_fatal 30583 1726853750.11615: checking for max_fail_percentage 30583 1726853750.11617: done checking for max_fail_percentage 30583 1726853750.11618: checking to see if all hosts have failed and the running result is not ok 30583 1726853750.11618: done checking to see if all hosts have failed 30583 1726853750.11619: getting the remaining hosts for this loop 30583 1726853750.11621: done getting the remaining hosts for this loop 30583 1726853750.11624: getting the next task for host managed_node2 30583 1726853750.11633: done getting next task for host managed_node2 30583 1726853750.11638: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853750.11643: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853750.11656: getting variables 30583 1726853750.11658: in VariableManager get_vars() 30583 1726853750.11700: Calling all_inventory to load vars for managed_node2 30583 1726853750.11703: Calling groups_inventory to load vars for managed_node2 30583 1726853750.11706: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853750.11776: Calling all_plugins_play to load vars for managed_node2 30583 1726853750.11781: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853750.11785: Calling groups_plugins_play to load vars for managed_node2 30583 1726853750.13534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853750.15218: done with get_vars() 30583 1726853750.15248: done getting variables 30583 1726853750.15322: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:35:50 -0400 (0:00:00.066) 0:01:25.490 ****** 30583 1726853750.15368: entering _queue_task() for managed_node2/fail 30583 1726853750.15852: worker is 1 (out of 1 available) 30583 1726853750.15865: exiting _queue_task() for managed_node2/fail 30583 1726853750.15877: done queuing things up, now waiting for results queue to drain 30583 1726853750.15878: waiting for pending results... 30583 1726853750.16196: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853750.16273: in run() - task 02083763-bbaf-05ea-abc5-0000000019c1 30583 1726853750.16301: variable 'ansible_search_path' from source: unknown 30583 1726853750.16307: variable 'ansible_search_path' from source: unknown 30583 1726853750.16344: calling self._execute() 30583 1726853750.16476: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853750.16547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853750.16565: variable 'omit' from source: magic vars 30583 1726853750.16991: variable 'ansible_distribution_major_version' from source: facts 30583 1726853750.17009: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853750.17144: variable 'network_state' from source: role '' defaults 30583 1726853750.17168: Evaluated conditional (network_state != {}): False 30583 1726853750.17178: when evaluation is False, skipping this task 30583 1726853750.17185: _execute() done 30583 1726853750.17191: dumping result to json 30583 1726853750.17197: done dumping result, returning 30583 1726853750.17207: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-05ea-abc5-0000000019c1] 30583 1726853750.17215: sending task result for task 02083763-bbaf-05ea-abc5-0000000019c1 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853750.17497: no more pending results, returning what we have 30583 1726853750.17502: results queue empty 30583 1726853750.17504: checking for any_errors_fatal 30583 1726853750.17513: done checking for any_errors_fatal 30583 1726853750.17514: checking for max_fail_percentage 30583 1726853750.17516: done checking for max_fail_percentage 30583 1726853750.17517: checking to see if all hosts have failed and the running result is not ok 30583 1726853750.17518: done checking to see if all hosts have failed 30583 1726853750.17519: getting the remaining hosts for this loop 30583 1726853750.17521: done getting the remaining hosts for this loop 30583 1726853750.17525: getting the next task for host managed_node2 30583 1726853750.17536: done getting next task for host managed_node2 30583 1726853750.17541: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30583 1726853750.17580: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853750.17614: getting variables 30583 1726853750.17616: in VariableManager get_vars() 30583 1726853750.17785: Calling all_inventory to load vars for managed_node2 30583 1726853750.17788: Calling groups_inventory to load vars for managed_node2 30583 1726853750.17791: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853750.17798: done sending task result for task 02083763-bbaf-05ea-abc5-0000000019c1 30583 1726853750.17801: WORKER PROCESS EXITING 30583 1726853750.17811: Calling all_plugins_play to load vars for managed_node2 30583 1726853750.17815: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853750.17818: Calling groups_plugins_play to load vars for managed_node2 30583 1726853750.19818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853750.21740: done with get_vars() 30583 1726853750.21766: done getting variables 30583 1726853750.21831: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:35:50 -0400 (0:00:00.064) 0:01:25.555 ****** 30583 1726853750.21867: entering _queue_task() for managed_node2/fail 30583 1726853750.22227: worker is 1 (out of 1 available) 30583 1726853750.22382: exiting _queue_task() for managed_node2/fail 30583 1726853750.22400: done queuing things up, now waiting for results queue to drain 30583 1726853750.22402: waiting for pending results... 30583 1726853750.22615: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30583 1726853750.22830: in run() - task 02083763-bbaf-05ea-abc5-0000000019c2 30583 1726853750.22870: variable 'ansible_search_path' from source: unknown 30583 1726853750.22881: variable 'ansible_search_path' from source: unknown 30583 1726853750.22925: calling self._execute() 30583 1726853750.23056: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853750.23179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853750.23184: variable 'omit' from source: magic vars 30583 1726853750.23908: variable 'ansible_distribution_major_version' from source: facts 30583 1726853750.23926: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853750.24165: variable 'network_state' from source: role '' defaults 30583 1726853750.24239: Evaluated conditional (network_state != {}): False 30583 1726853750.24249: when evaluation is False, skipping this task 30583 1726853750.24257: _execute() done 30583 1726853750.24264: dumping result to json 30583 1726853750.24274: done dumping result, returning 30583 1726853750.24377: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the 
system version of the managed host is below 8 [02083763-bbaf-05ea-abc5-0000000019c2] 30583 1726853750.24381: sending task result for task 02083763-bbaf-05ea-abc5-0000000019c2 30583 1726853750.24660: done sending task result for task 02083763-bbaf-05ea-abc5-0000000019c2 30583 1726853750.24663: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853750.24719: no more pending results, returning what we have 30583 1726853750.24724: results queue empty 30583 1726853750.24725: checking for any_errors_fatal 30583 1726853750.24737: done checking for any_errors_fatal 30583 1726853750.24738: checking for max_fail_percentage 30583 1726853750.24741: done checking for max_fail_percentage 30583 1726853750.24742: checking to see if all hosts have failed and the running result is not ok 30583 1726853750.24743: done checking to see if all hosts have failed 30583 1726853750.24743: getting the remaining hosts for this loop 30583 1726853750.24745: done getting the remaining hosts for this loop 30583 1726853750.24750: getting the next task for host managed_node2 30583 1726853750.24760: done getting next task for host managed_node2 30583 1726853750.24769: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30583 1726853750.24777: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853750.24809: getting variables 30583 1726853750.24811: in VariableManager get_vars() 30583 1726853750.24861: Calling all_inventory to load vars for managed_node2 30583 1726853750.24865: Calling groups_inventory to load vars for managed_node2 30583 1726853750.24867: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853750.25183: Calling all_plugins_play to load vars for managed_node2 30583 1726853750.25186: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853750.25189: Calling groups_plugins_play to load vars for managed_node2 30583 1726853750.28342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853750.31670: done with get_vars() 30583 1726853750.31715: done getting variables 30583 1726853750.31783: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 
September 2024 13:35:50 -0400 (0:00:00.099) 0:01:25.655 ****** 30583 1726853750.31828: entering _queue_task() for managed_node2/fail 30583 1726853750.32203: worker is 1 (out of 1 available) 30583 1726853750.32215: exiting _queue_task() for managed_node2/fail 30583 1726853750.32228: done queuing things up, now waiting for results queue to drain 30583 1726853750.32229: waiting for pending results... 30583 1726853750.32595: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30583 1726853750.32787: in run() - task 02083763-bbaf-05ea-abc5-0000000019c3 30583 1726853750.32801: variable 'ansible_search_path' from source: unknown 30583 1726853750.32804: variable 'ansible_search_path' from source: unknown 30583 1726853750.32821: calling self._execute() 30583 1726853750.32942: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853750.32954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853750.33017: variable 'omit' from source: magic vars 30583 1726853750.33390: variable 'ansible_distribution_major_version' from source: facts 30583 1726853750.33408: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853750.33605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853750.37433: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853750.37515: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853750.37641: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853750.37644: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 
1726853750.37647: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30583 1726853750.37719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853750.37769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853750.37805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853750.37860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853750.37885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853750.38000: variable 'ansible_distribution_major_version' from source: facts
30583 1726853750.38021: Evaluated conditional (ansible_distribution_major_version | int > 9): True
30583 1726853750.38403: variable 'ansible_distribution' from source: facts
30583 1726853750.38406: variable '__network_rh_distros' from source: role '' defaults
30583 1726853750.38408: Evaluated conditional (ansible_distribution in __network_rh_distros): True
30583 1726853750.38841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853750.38868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853750.38944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853750.39053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853750.39074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853750.39192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853750.39227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853750.39300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853750.39432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853750.39440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853750.39547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853750.39575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853750.39675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853750.39719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853750.39775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853750.40443: variable 'network_connections' from source: include params
30583 1726853750.40572: variable 'interface' from source: play vars
30583 1726853750.40604: variable 'interface' from source: play vars
30583 1726853750.40694: variable 'network_state' from source: role '' defaults
30583 1726853750.40823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30583 1726853750.41076: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30583 1726853750.41123: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30583 1726853750.41158: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30583 1726853750.41197: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30583 1726853750.41248: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30583 1726853750.41278: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30583 1726853750.41317: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853750.41386: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30583 1726853750.41390: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False
30583 1726853750.41392: when evaluation is False, skipping this task
30583 1726853750.41394: _execute() done
30583 1726853750.41400: dumping result to json
30583 1726853750.41407: done dumping result, returning
30583 1726853750.41418: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-05ea-abc5-0000000019c3]
30583 1726853750.41427: sending task result for task 02083763-bbaf-05ea-abc5-0000000019c3
30583 1726853750.41714: done sending task result for task 02083763-bbaf-05ea-abc5-0000000019c3
30583 1726853750.41717: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0",
    "skip_reason": "Conditional result was False"
}
30583 1726853750.41768: no more pending results, returning what we have
30583 1726853750.41774: results queue empty
30583 1726853750.41776: checking for any_errors_fatal
30583 1726853750.41781: done checking for any_errors_fatal
30583 1726853750.41782: checking for max_fail_percentage
30583 1726853750.41785: done checking for max_fail_percentage
30583 1726853750.41786: checking to see if all hosts have failed and the running result is not ok
30583 1726853750.41786: done checking to see if all hosts have failed
30583 1726853750.41787: getting the remaining hosts for this loop
30583 1726853750.41789: done getting the remaining hosts for this loop
30583 1726853750.41793: getting the next task for host managed_node2
30583 1726853750.41801: done getting next task for host managed_node2
30583 1726853750.41806: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
30583 1726853750.41811: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853750.41843: getting variables
30583 1726853750.41846: in VariableManager get_vars()
30583 1726853750.41894: Calling all_inventory to load vars for managed_node2
30583 1726853750.41898: Calling groups_inventory to load vars for managed_node2
30583 1726853750.41901: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853750.41911: Calling all_plugins_play to load vars for managed_node2
30583 1726853750.41914: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853750.41917: Calling groups_plugins_play to load vars for managed_node2
30583 1726853750.43523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853750.45259: done with get_vars()
30583 1726853750.45284: done getting variables
30583 1726853750.45345: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024  13:35:50 -0400 (0:00:00.135)       0:01:25.791 ******
30583 1726853750.45381: entering _queue_task() for managed_node2/dnf
30583 1726853750.45978: worker is 1 (out of 1 available)
30583 1726853750.45986: exiting _queue_task() for managed_node2/dnf
30583 1726853750.45996: done queuing things up, now waiting for results queue to drain
30583 1726853750.45997: waiting for pending results...
30583 1726853750.46128: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
30583 1726853750.46232: in run() - task 02083763-bbaf-05ea-abc5-0000000019c4
30583 1726853750.46252: variable 'ansible_search_path' from source: unknown
30583 1726853750.46259: variable 'ansible_search_path' from source: unknown
30583 1726853750.46299: calling self._execute()
30583 1726853750.46408: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853750.46419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853750.46441: variable 'omit' from source: magic vars
30583 1726853750.47204: variable 'ansible_distribution_major_version' from source: facts
30583 1726853750.47207: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853750.47638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30583 1726853750.51709: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30583 1726853750.51782: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30583 1726853750.51826: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30583 1726853750.51865: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30583 1726853750.51903: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30583 1726853750.51986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853750.52035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853750.52066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853750.52119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853750.52142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853750.52274: variable 'ansible_distribution' from source: facts
30583 1726853750.52286: variable 'ansible_distribution_major_version' from source: facts
30583 1726853750.52308: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
30583 1726853750.52449: variable '__network_wireless_connections_defined' from source: role '' defaults
30583 1726853750.52556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853750.52585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853750.52666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853750.52669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853750.52675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853750.52717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853750.52743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853750.52777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853750.52819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853750.52835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853750.52874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853750.52904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853750.52932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853750.52992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853750.52996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853750.53154: variable 'network_connections' from source: include params
30583 1726853750.53209: variable 'interface' from source: play vars
30583 1726853750.53243: variable 'interface' from source: play vars
30583 1726853750.53321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30583 1726853750.53489: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30583 1726853750.53536: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30583 1726853750.53572: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30583 1726853750.53644: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30583 1726853750.53655: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30583 1726853750.53686: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30583 1726853750.53725: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853750.53758: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30583 1726853750.53862: variable '__network_team_connections_defined' from source: role '' defaults
30583 1726853750.54055: variable 'network_connections' from source: include params
30583 1726853750.54064: variable 'interface' from source: play vars
30583 1726853750.54278: variable 'interface' from source: play vars
30583 1726853750.54281: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
30583 1726853750.54283: when evaluation is False, skipping this task
30583 1726853750.54285: _execute() done
30583 1726853750.54287: dumping result to json
30583 1726853750.54289: done dumping result, returning
30583 1726853750.54494: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-0000000019c4]
30583 1726853750.54497: sending task result for task 02083763-bbaf-05ea-abc5-0000000019c4
30583 1726853750.54602: done sending task result for task 02083763-bbaf-05ea-abc5-0000000019c4
30583 1726853750.54606: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
30583 1726853750.54667: no more pending results, returning what we have
30583 1726853750.54673: results queue empty
30583 1726853750.54674: checking for any_errors_fatal
30583 1726853750.54681: done checking for any_errors_fatal
30583 1726853750.54682: checking for max_fail_percentage
30583 1726853750.54685: done checking for max_fail_percentage
30583 1726853750.54686: checking to see if all hosts have failed and the running result is not ok
30583 1726853750.54688: done checking to see if all hosts have failed
30583 1726853750.54689: getting the remaining hosts for this loop
30583 1726853750.54691: done getting the remaining hosts for this loop
30583 1726853750.54695: getting the next task for host managed_node2
30583 1726853750.54704: done getting next task for host managed_node2
30583 1726853750.54709: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
30583 1726853750.54714: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853750.54742: getting variables
30583 1726853750.54744: in VariableManager get_vars()
30583 1726853750.54796: Calling all_inventory to load vars for managed_node2
30583 1726853750.54799: Calling groups_inventory to load vars for managed_node2
30583 1726853750.54801: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853750.54810: Calling all_plugins_play to load vars for managed_node2
30583 1726853750.54813: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853750.54815: Calling groups_plugins_play to load vars for managed_node2
30583 1726853750.56977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853750.59574: done with get_vars()
30583 1726853750.59607: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
30583 1726853750.59801: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024  13:35:50 -0400 (0:00:00.144)       0:01:25.935 ******
30583 1726853750.59931: entering _queue_task() for managed_node2/yum
30583 1726853750.60825: worker is 1 (out of 1 available)
30583 1726853750.60836: exiting _queue_task() for managed_node2/yum
30583 1726853750.60847: done queuing things up, now waiting for results queue to drain
30583 1726853750.60848: waiting for pending results...
30583 1726853750.61048: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
30583 1726853750.61209: in run() - task 02083763-bbaf-05ea-abc5-0000000019c5
30583 1726853750.61229: variable 'ansible_search_path' from source: unknown
30583 1726853750.61238: variable 'ansible_search_path' from source: unknown
30583 1726853750.61288: calling self._execute()
30583 1726853750.61401: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853750.61413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853750.61429: variable 'omit' from source: magic vars
30583 1726853750.61877: variable 'ansible_distribution_major_version' from source: facts
30583 1726853750.61880: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853750.62030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30583 1726853750.66020: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30583 1726853750.66100: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30583 1726853750.66146: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30583 1726853750.66189: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30583 1726853750.66254: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30583 1726853750.66310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853750.66904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853750.66909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853750.66946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853750.66966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853750.67229: variable 'ansible_distribution_major_version' from source: facts
30583 1726853750.67248: Evaluated conditional (ansible_distribution_major_version | int < 8): False
30583 1726853750.67276: when evaluation is False, skipping this task
30583 1726853750.67284: _execute() done
30583 1726853750.67291: dumping result to json
30583 1726853750.67298: done dumping result, returning
30583 1726853750.67312: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-0000000019c5]
30583 1726853750.67344: sending task result for task 02083763-bbaf-05ea-abc5-0000000019c5
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
30583 1726853750.67504: no more pending results, returning what we have
30583 1726853750.67508: results queue empty
30583 1726853750.67509: checking for any_errors_fatal
30583 1726853750.67516: done checking for any_errors_fatal
30583 1726853750.67517: checking for max_fail_percentage
30583 1726853750.67519: done checking for max_fail_percentage
30583 1726853750.67520: checking to see if all hosts have failed and the running result is not ok
30583 1726853750.67521: done checking to see if all hosts have failed
30583 1726853750.67522: getting the remaining hosts for this loop
30583 1726853750.67524: done getting the remaining hosts for this loop
30583 1726853750.67528: getting the next task for host managed_node2
30583 1726853750.67538: done getting next task for host managed_node2
30583 1726853750.67541: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
30583 1726853750.67547: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853750.67575: getting variables
30583 1726853750.67578: in VariableManager get_vars()
30583 1726853750.67622: Calling all_inventory to load vars for managed_node2
30583 1726853750.67625: Calling groups_inventory to load vars for managed_node2
30583 1726853750.67628: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853750.67638: Calling all_plugins_play to load vars for managed_node2
30583 1726853750.67641: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853750.67644: Calling groups_plugins_play to load vars for managed_node2
30583 1726853750.68669: done sending task result for task 02083763-bbaf-05ea-abc5-0000000019c5
30583 1726853750.68675: WORKER PROCESS EXITING
30583 1726853750.71718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853750.76222: done with get_vars()
30583 1726853750.76255: done getting variables
30583 1726853750.76321: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024  13:35:50 -0400 (0:00:00.165)       0:01:26.100 ******
30583 1726853750.76360: entering _queue_task() for managed_node2/fail
30583 1726853750.76870: worker is 1 (out of 1 available)
30583 1726853750.76883: exiting _queue_task() for managed_node2/fail
30583 1726853750.76894: done queuing things up, now waiting for results queue to drain
30583 1726853750.76895: waiting for pending results...
30583 1726853750.77189: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
30583 1726853750.77578: in run() - task 02083763-bbaf-05ea-abc5-0000000019c6
30583 1726853750.77650: variable 'ansible_search_path' from source: unknown
30583 1726853750.77687: variable 'ansible_search_path' from source: unknown
30583 1726853750.77732: calling self._execute()
30583 1726853750.78146: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853750.78151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853750.78155: variable 'omit' from source: magic vars
30583 1726853750.79302: variable 'ansible_distribution_major_version' from source: facts
30583 1726853750.79318: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853750.79553: variable '__network_wireless_connections_defined' from source: role '' defaults
30583 1726853750.79963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30583 1726853750.83739: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30583 1726853750.83919: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30583 1726853750.84012: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30583 1726853750.84063: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30583 1726853750.84101: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30583 1726853750.84191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853750.84242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853750.84276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853750.84331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853750.84352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853750.84408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853750.84434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853750.84535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853750.84677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853750.84683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853750.84731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853750.84879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853750.84882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853750.84885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853750.84952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853750.85359: variable 'network_connections' from source: include params
30583 1726853750.85446: variable 'interface' from source: play vars
30583 1726853750.85634: variable 'interface' from source: play vars
30583 1726853750.85797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30583 1726853750.86190: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30583 1726853750.86294: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30583 1726853750.86323: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30583 1726853750.86381: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30583 1726853750.86502: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30583 1726853750.86559: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30583 1726853750.86729: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853750.86732: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30583 1726853750.86815: variable '__network_team_connections_defined' from source: role '' defaults
30583 1726853750.87321: variable 'network_connections' from source: include params
30583 1726853750.87485: variable 'interface' from source: play vars
30583 1726853750.87593: variable 'interface' from source: play vars
30583 1726853750.87675: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
30583 1726853750.87685: when evaluation is False, skipping this task
30583
1726853750.87693: _execute() done 30583 1726853750.87700: dumping result to json 30583 1726853750.87714: done dumping result, returning 30583 1726853750.87727: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-0000000019c6] 30583 1726853750.87735: sending task result for task 02083763-bbaf-05ea-abc5-0000000019c6 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853750.87913: no more pending results, returning what we have 30583 1726853750.87917: results queue empty 30583 1726853750.87918: checking for any_errors_fatal 30583 1726853750.87925: done checking for any_errors_fatal 30583 1726853750.87926: checking for max_fail_percentage 30583 1726853750.87928: done checking for max_fail_percentage 30583 1726853750.87929: checking to see if all hosts have failed and the running result is not ok 30583 1726853750.87930: done checking to see if all hosts have failed 30583 1726853750.87930: getting the remaining hosts for this loop 30583 1726853750.87933: done getting the remaining hosts for this loop 30583 1726853750.87937: getting the next task for host managed_node2 30583 1726853750.87945: done getting next task for host managed_node2 30583 1726853750.87950: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30583 1726853750.87955: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853750.88192: getting variables 30583 1726853750.88194: in VariableManager get_vars() 30583 1726853750.88241: Calling all_inventory to load vars for managed_node2 30583 1726853750.88244: Calling groups_inventory to load vars for managed_node2 30583 1726853750.88246: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853750.88256: Calling all_plugins_play to load vars for managed_node2 30583 1726853750.88260: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853750.88263: Calling groups_plugins_play to load vars for managed_node2 30583 1726853750.88818: done sending task result for task 02083763-bbaf-05ea-abc5-0000000019c6 30583 1726853750.88822: WORKER PROCESS EXITING 30583 1726853750.90144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853750.93886: done with get_vars() 30583 1726853750.93921: done getting variables 30583 1726853750.94088: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:35:50 -0400 (0:00:00.177) 0:01:26.278 ****** 30583 1726853750.94130: entering _queue_task() for managed_node2/package 30583 1726853750.94539: worker is 1 (out of 1 available) 30583 1726853750.94552: exiting _queue_task() for managed_node2/package 30583 1726853750.94563: done queuing things up, now waiting for results queue to drain 30583 1726853750.94564: waiting for pending results... 30583 1726853750.94888: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 30583 1726853750.95061: in run() - task 02083763-bbaf-05ea-abc5-0000000019c7 30583 1726853750.95082: variable 'ansible_search_path' from source: unknown 30583 1726853750.95091: variable 'ansible_search_path' from source: unknown 30583 1726853750.95143: calling self._execute() 30583 1726853750.95267: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853750.95280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853750.95296: variable 'omit' from source: magic vars 30583 1726853750.96028: variable 'ansible_distribution_major_version' from source: facts 30583 1726853750.96077: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853750.96402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853750.96955: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853750.97113: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853750.97151: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853750.97284: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853750.97427: variable 'network_packages' from source: role '' defaults 30583 1726853750.97533: variable '__network_provider_setup' from source: role '' defaults 30583 1726853750.97555: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853750.97622: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853750.97641: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853750.97709: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853750.97909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853751.01682: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853751.01687: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853751.01762: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853751.01829: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853751.01927: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853751.02085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853751.02150: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853751.02234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853751.02284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853751.02304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853751.02374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853751.02403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853751.02431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853751.02480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853751.02497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 
1726853751.02714: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853751.02832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853751.02879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853751.02884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853751.02923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853751.02941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853751.03096: variable 'ansible_python' from source: facts 30583 1726853751.03100: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853751.03146: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853751.03232: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853751.03364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853751.03393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853751.03427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853751.03470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853751.03493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853751.03547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853751.03640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853751.03643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853751.03661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853751.03683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853751.03833: variable 'network_connections' from source: include params 
30583 1726853751.03845: variable 'interface' from source: play vars 30583 1726853751.04105: variable 'interface' from source: play vars 30583 1726853751.04197: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853751.04207: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853751.04442: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853751.04446: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853751.04448: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853751.05043: variable 'network_connections' from source: include params 30583 1726853751.05046: variable 'interface' from source: play vars 30583 1726853751.05355: variable 'interface' from source: play vars 30583 1726853751.05526: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853751.05529: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853751.05943: variable 'network_connections' from source: include params 30583 1726853751.05946: variable 'interface' from source: play vars 30583 1726853751.06016: variable 'interface' from source: play vars 30583 1726853751.06038: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853751.06120: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853751.06429: variable 'network_connections' 
from source: include params 30583 1726853751.06432: variable 'interface' from source: play vars 30583 1726853751.06640: variable 'interface' from source: play vars 30583 1726853751.06643: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853751.06645: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853751.06648: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853751.06698: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853751.06920: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853751.07478: variable 'network_connections' from source: include params 30583 1726853751.07481: variable 'interface' from source: play vars 30583 1726853751.07505: variable 'interface' from source: play vars 30583 1726853751.07512: variable 'ansible_distribution' from source: facts 30583 1726853751.07515: variable '__network_rh_distros' from source: role '' defaults 30583 1726853751.07524: variable 'ansible_distribution_major_version' from source: facts 30583 1726853751.07538: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853751.07697: variable 'ansible_distribution' from source: facts 30583 1726853751.07701: variable '__network_rh_distros' from source: role '' defaults 30583 1726853751.07703: variable 'ansible_distribution_major_version' from source: facts 30583 1726853751.07720: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853751.07977: variable 'ansible_distribution' from source: facts 30583 1726853751.07981: variable '__network_rh_distros' from source: role '' defaults 30583 1726853751.07983: variable 'ansible_distribution_major_version' from source: facts 30583 1726853751.07985: variable 'network_provider' from source: set_fact 30583 
1726853751.07987: variable 'ansible_facts' from source: unknown 30583 1726853751.08618: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30583 1726853751.08622: when evaluation is False, skipping this task 30583 1726853751.08624: _execute() done 30583 1726853751.08627: dumping result to json 30583 1726853751.08629: done dumping result, returning 30583 1726853751.08640: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-05ea-abc5-0000000019c7] 30583 1726853751.08645: sending task result for task 02083763-bbaf-05ea-abc5-0000000019c7 30583 1726853751.08742: done sending task result for task 02083763-bbaf-05ea-abc5-0000000019c7 30583 1726853751.08746: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30583 1726853751.08806: no more pending results, returning what we have 30583 1726853751.08810: results queue empty 30583 1726853751.08811: checking for any_errors_fatal 30583 1726853751.08820: done checking for any_errors_fatal 30583 1726853751.08821: checking for max_fail_percentage 30583 1726853751.08823: done checking for max_fail_percentage 30583 1726853751.08824: checking to see if all hosts have failed and the running result is not ok 30583 1726853751.08825: done checking to see if all hosts have failed 30583 1726853751.08826: getting the remaining hosts for this loop 30583 1726853751.08828: done getting the remaining hosts for this loop 30583 1726853751.08832: getting the next task for host managed_node2 30583 1726853751.08841: done getting next task for host managed_node2 30583 1726853751.08845: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30583 1726853751.08850: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853751.08878: getting variables 30583 1726853751.08880: in VariableManager get_vars() 30583 1726853751.08932: Calling all_inventory to load vars for managed_node2 30583 1726853751.08935: Calling groups_inventory to load vars for managed_node2 30583 1726853751.08938: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853751.08949: Calling all_plugins_play to load vars for managed_node2 30583 1726853751.08953: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853751.08956: Calling groups_plugins_play to load vars for managed_node2 30583 1726853751.12249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853751.14398: done with get_vars() 30583 1726853751.14431: done getting variables 30583 1726853751.14529: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:35:51 -0400 (0:00:00.204) 0:01:26.482 ****** 30583 1726853751.14565: entering _queue_task() for managed_node2/package 30583 1726853751.15452: worker is 1 (out of 1 available) 30583 1726853751.15466: exiting _queue_task() for managed_node2/package 30583 1726853751.15479: done queuing things up, now waiting for results queue to drain 30583 1726853751.15481: waiting for pending results... 
30583 1726853751.16294: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30583 1726853751.16300: in run() - task 02083763-bbaf-05ea-abc5-0000000019c8 30583 1726853751.16303: variable 'ansible_search_path' from source: unknown 30583 1726853751.16306: variable 'ansible_search_path' from source: unknown 30583 1726853751.16309: calling self._execute() 30583 1726853751.16578: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853751.16583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853751.16585: variable 'omit' from source: magic vars 30583 1726853751.16865: variable 'ansible_distribution_major_version' from source: facts 30583 1726853751.16869: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853751.16936: variable 'network_state' from source: role '' defaults 30583 1726853751.16948: Evaluated conditional (network_state != {}): False 30583 1726853751.16951: when evaluation is False, skipping this task 30583 1726853751.16954: _execute() done 30583 1726853751.16956: dumping result to json 30583 1726853751.16961: done dumping result, returning 30583 1726853751.16969: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-05ea-abc5-0000000019c8] 30583 1726853751.16978: sending task result for task 02083763-bbaf-05ea-abc5-0000000019c8 30583 1726853751.17087: done sending task result for task 02083763-bbaf-05ea-abc5-0000000019c8 30583 1726853751.17092: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853751.17144: no more pending results, returning what we have 30583 1726853751.17148: results queue empty 30583 1726853751.17149: checking 
for any_errors_fatal 30583 1726853751.17157: done checking for any_errors_fatal 30583 1726853751.17158: checking for max_fail_percentage 30583 1726853751.17160: done checking for max_fail_percentage 30583 1726853751.17161: checking to see if all hosts have failed and the running result is not ok 30583 1726853751.17162: done checking to see if all hosts have failed 30583 1726853751.17162: getting the remaining hosts for this loop 30583 1726853751.17164: done getting the remaining hosts for this loop 30583 1726853751.17168: getting the next task for host managed_node2 30583 1726853751.17184: done getting next task for host managed_node2 30583 1726853751.17190: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30583 1726853751.17196: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853751.17224: getting variables 30583 1726853751.17226: in VariableManager get_vars() 30583 1726853751.17375: Calling all_inventory to load vars for managed_node2 30583 1726853751.17380: Calling groups_inventory to load vars for managed_node2 30583 1726853751.17383: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853751.17394: Calling all_plugins_play to load vars for managed_node2 30583 1726853751.17402: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853751.17406: Calling groups_plugins_play to load vars for managed_node2 30583 1726853751.19842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853751.22317: done with get_vars() 30583 1726853751.22435: done getting variables 30583 1726853751.22503: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:35:51 -0400 (0:00:00.080) 0:01:26.563 ****** 30583 1726853751.22653: entering _queue_task() for managed_node2/package 30583 1726853751.23680: worker is 1 (out of 1 available) 30583 1726853751.23697: exiting _queue_task() for managed_node2/package 30583 1726853751.23709: done queuing things up, now waiting for results queue to drain 30583 1726853751.23710: waiting for pending results... 
30583 1726853751.24493: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30583 1726853751.24778: in run() - task 02083763-bbaf-05ea-abc5-0000000019c9 30583 1726853751.24782: variable 'ansible_search_path' from source: unknown 30583 1726853751.24785: variable 'ansible_search_path' from source: unknown 30583 1726853751.24869: calling self._execute() 30583 1726853751.25477: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853751.25484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853751.25486: variable 'omit' from source: magic vars 30583 1726853751.26479: variable 'ansible_distribution_major_version' from source: facts 30583 1726853751.26516: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853751.27046: variable 'network_state' from source: role '' defaults 30583 1726853751.27049: Evaluated conditional (network_state != {}): False 30583 1726853751.27052: when evaluation is False, skipping this task 30583 1726853751.27055: _execute() done 30583 1726853751.27057: dumping result to json 30583 1726853751.27062: done dumping result, returning 30583 1726853751.27064: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-05ea-abc5-0000000019c9] 30583 1726853751.27066: sending task result for task 02083763-bbaf-05ea-abc5-0000000019c9 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853751.27398: no more pending results, returning what we have 30583 1726853751.27402: results queue empty 30583 1726853751.27403: checking for any_errors_fatal 30583 1726853751.27413: done checking for any_errors_fatal 30583 1726853751.27414: checking for max_fail_percentage 30583 
1726853751.27417: done checking for max_fail_percentage 30583 1726853751.27418: checking to see if all hosts have failed and the running result is not ok 30583 1726853751.27418: done checking to see if all hosts have failed 30583 1726853751.27419: getting the remaining hosts for this loop 30583 1726853751.27421: done getting the remaining hosts for this loop 30583 1726853751.27425: getting the next task for host managed_node2 30583 1726853751.27437: done getting next task for host managed_node2 30583 1726853751.27446: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30583 1726853751.27452: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853751.27484: getting variables 30583 1726853751.27486: in VariableManager get_vars() 30583 1726853751.27533: Calling all_inventory to load vars for managed_node2 30583 1726853751.27536: Calling groups_inventory to load vars for managed_node2 30583 1726853751.27538: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853751.27664: Calling all_plugins_play to load vars for managed_node2 30583 1726853751.27669: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853751.27675: Calling groups_plugins_play to load vars for managed_node2 30583 1726853751.28354: done sending task result for task 02083763-bbaf-05ea-abc5-0000000019c9 30583 1726853751.28361: WORKER PROCESS EXITING 30583 1726853751.30557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853751.32366: done with get_vars() 30583 1726853751.32405: done getting variables 30583 1726853751.32477: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:35:51 -0400 (0:00:00.098) 0:01:26.662 ****** 30583 1726853751.32514: entering _queue_task() for managed_node2/service 30583 1726853751.33088: worker is 1 (out of 1 available) 30583 1726853751.33104: exiting _queue_task() for managed_node2/service 30583 1726853751.33118: done queuing things up, now waiting for results queue to drain 30583 1726853751.33119: waiting for pending results... 
30583 1726853751.33941: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30583 1726853751.34148: in run() - task 02083763-bbaf-05ea-abc5-0000000019ca 30583 1726853751.34160: variable 'ansible_search_path' from source: unknown 30583 1726853751.34166: variable 'ansible_search_path' from source: unknown 30583 1726853751.34215: calling self._execute() 30583 1726853751.34434: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853751.34438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853751.34442: variable 'omit' from source: magic vars 30583 1726853751.34826: variable 'ansible_distribution_major_version' from source: facts 30583 1726853751.34829: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853751.35079: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853751.35777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853751.38196: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853751.38563: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853751.38567: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853751.38569: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853751.38573: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853751.38876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30583 1726853751.38924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853751.38955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853751.39006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853751.39031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853751.39122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853751.39255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853751.39315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853751.39439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853751.39463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853751.39622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853751.39625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853751.39831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853751.39834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853751.39837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853751.40193: variable 'network_connections' from source: include params 30583 1726853751.40210: variable 'interface' from source: play vars 30583 1726853751.40392: variable 'interface' from source: play vars 30583 1726853751.40526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853751.41052: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853751.41178: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853751.41289: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853751.41323: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853751.41578: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853751.41581: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853751.41583: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853751.41584: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853751.41690: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853751.41983: variable 'network_connections' from source: include params 30583 1726853751.41993: variable 'interface' from source: play vars 30583 1726853751.42069: variable 'interface' from source: play vars 30583 1726853751.42101: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853751.42110: when evaluation is False, skipping this task 30583 1726853751.42121: _execute() done 30583 1726853751.42126: dumping result to json 30583 1726853751.42137: done dumping result, returning 30583 1726853751.42148: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-0000000019ca] 30583 1726853751.42157: sending task result for task 02083763-bbaf-05ea-abc5-0000000019ca 30583 1726853751.42480: done sending task result for task 
02083763-bbaf-05ea-abc5-0000000019ca 30583 1726853751.42491: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853751.42541: no more pending results, returning what we have 30583 1726853751.42544: results queue empty 30583 1726853751.42545: checking for any_errors_fatal 30583 1726853751.42550: done checking for any_errors_fatal 30583 1726853751.42551: checking for max_fail_percentage 30583 1726853751.42553: done checking for max_fail_percentage 30583 1726853751.42554: checking to see if all hosts have failed and the running result is not ok 30583 1726853751.42555: done checking to see if all hosts have failed 30583 1726853751.42556: getting the remaining hosts for this loop 30583 1726853751.42561: done getting the remaining hosts for this loop 30583 1726853751.42565: getting the next task for host managed_node2 30583 1726853751.42576: done getting next task for host managed_node2 30583 1726853751.42580: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30583 1726853751.42584: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853751.42610: getting variables 30583 1726853751.42611: in VariableManager get_vars() 30583 1726853751.42655: Calling all_inventory to load vars for managed_node2 30583 1726853751.42657: Calling groups_inventory to load vars for managed_node2 30583 1726853751.42662: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853751.42675: Calling all_plugins_play to load vars for managed_node2 30583 1726853751.42679: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853751.42682: Calling groups_plugins_play to load vars for managed_node2 30583 1726853751.46899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853751.49219: done with get_vars() 30583 1726853751.49276: done getting variables 30583 1726853751.49346: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:35:51 -0400 (0:00:00.168) 0:01:26.831 ****** 30583 1726853751.49405: entering _queue_task() for managed_node2/service 30583 1726853751.50161: worker is 1 (out of 1 available) 30583 1726853751.50175: exiting _queue_task() for managed_node2/service 30583 1726853751.50187: done 
queuing things up, now waiting for results queue to drain 30583 1726853751.50189: waiting for pending results... 30583 1726853751.50532: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30583 1726853751.50875: in run() - task 02083763-bbaf-05ea-abc5-0000000019cb 30583 1726853751.51015: variable 'ansible_search_path' from source: unknown 30583 1726853751.51019: variable 'ansible_search_path' from source: unknown 30583 1726853751.51029: calling self._execute() 30583 1726853751.51342: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853751.51345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853751.51348: variable 'omit' from source: magic vars 30583 1726853751.51761: variable 'ansible_distribution_major_version' from source: facts 30583 1726853751.51802: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853751.52056: variable 'network_provider' from source: set_fact 30583 1726853751.52074: variable 'network_state' from source: role '' defaults 30583 1726853751.52091: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30583 1726853751.52110: variable 'omit' from source: magic vars 30583 1726853751.52189: variable 'omit' from source: magic vars 30583 1726853751.52232: variable 'network_service_name' from source: role '' defaults 30583 1726853751.52310: variable 'network_service_name' from source: role '' defaults 30583 1726853751.52436: variable '__network_provider_setup' from source: role '' defaults 30583 1726853751.52447: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853751.52524: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853751.52579: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853751.52614: variable '__network_packages_default_nm' from source: role '' 
defaults 30583 1726853751.52980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853751.54997: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853751.55064: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853751.55104: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853751.55142: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853751.55175: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853751.55261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853751.55292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853751.55324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853751.55364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853751.55380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853751.55432: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853751.55453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853751.55480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853751.55520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853751.55537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853751.55776: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853751.55891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853751.55914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853751.55978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853751.55989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853751.56003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853751.56096: variable 'ansible_python' from source: facts 30583 1726853751.56141: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853751.56376: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853751.56379: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853751.56525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853751.56529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853751.56531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853751.56563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853751.56576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853751.56646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853751.56660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853751.56663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853751.56694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853751.56708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853751.56885: variable 'network_connections' from source: include params 30583 1726853751.56889: variable 'interface' from source: play vars 30583 1726853751.56912: variable 'interface' from source: play vars 30583 1726853751.57012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853751.57190: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853751.57238: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853751.57282: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853751.57319: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853751.57392: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853751.57422: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853751.57541: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853751.57544: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853751.57547: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853751.57796: variable 'network_connections' from source: include params 30583 1726853751.57802: variable 'interface' from source: play vars 30583 1726853751.57877: variable 'interface' from source: play vars 30583 1726853751.57909: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853751.57986: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853751.58330: variable 'network_connections' from source: include params 30583 1726853751.58333: variable 'interface' from source: play vars 30583 1726853751.58414: variable 'interface' from source: play vars 30583 1726853751.58434: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853751.58513: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853751.58814: variable 'network_connections' from source: include params 30583 1726853751.58817: variable 'interface' from source: play vars 30583 1726853751.58917: variable 'interface' from source: play vars 30583 1726853751.59036: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30583 1726853751.59440: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853751.59443: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853751.59445: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853751.59447: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853751.59839: variable 'network_connections' from source: include params 30583 1726853751.59843: variable 'interface' from source: play vars 30583 1726853751.60078: variable 'interface' from source: play vars 30583 1726853751.60081: variable 'ansible_distribution' from source: facts 30583 1726853751.60083: variable '__network_rh_distros' from source: role '' defaults 30583 1726853751.60085: variable 'ansible_distribution_major_version' from source: facts 30583 1726853751.60087: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853751.60124: variable 'ansible_distribution' from source: facts 30583 1726853751.60127: variable '__network_rh_distros' from source: role '' defaults 30583 1726853751.60132: variable 'ansible_distribution_major_version' from source: facts 30583 1726853751.60146: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853751.60399: variable 'ansible_distribution' from source: facts 30583 1726853751.60402: variable '__network_rh_distros' from source: role '' defaults 30583 1726853751.60405: variable 'ansible_distribution_major_version' from source: facts 30583 1726853751.60445: variable 'network_provider' from source: set_fact 30583 1726853751.60478: variable 'omit' from source: magic vars 30583 1726853751.60508: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853751.60537: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853751.60564: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853751.60579: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853751.60592: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853751.60626: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853751.60629: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853751.60631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853751.60764: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853751.60769: Set connection var ansible_timeout to 10 30583 1726853751.60801: Set connection var ansible_connection to ssh 30583 1726853751.60804: Set connection var ansible_shell_executable to /bin/sh 30583 1726853751.60806: Set connection var ansible_shell_type to sh 30583 1726853751.60808: Set connection var ansible_pipelining to False 30583 1726853751.60844: variable 'ansible_shell_executable' from source: unknown 30583 1726853751.60847: variable 'ansible_connection' from source: unknown 30583 1726853751.60849: variable 'ansible_module_compression' from source: unknown 30583 1726853751.60851: variable 'ansible_shell_type' from source: unknown 30583 1726853751.60854: variable 'ansible_shell_executable' from source: unknown 30583 1726853751.60856: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853751.60860: variable 'ansible_pipelining' from source: unknown 30583 1726853751.60862: variable 'ansible_timeout' from source: unknown 30583 1726853751.60864: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 
1726853751.61019: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853751.61027: variable 'omit' from source: magic vars 30583 1726853751.61030: starting attempt loop 30583 1726853751.61033: running the handler 30583 1726853751.61100: variable 'ansible_facts' from source: unknown 30583 1726853751.62379: _low_level_execute_command(): starting 30583 1726853751.62382: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853751.62716: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853751.62784: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853751.62807: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853751.62818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 
1726853751.63036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853751.64665: stdout chunk (state=3): >>>/root <<< 30583 1726853751.64874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853751.64878: stderr chunk (state=3): >>><<< 30583 1726853751.64880: stdout chunk (state=3): >>><<< 30583 1726853751.64884: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853751.64887: _low_level_execute_command(): starting 30583 1726853751.64890: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853751.6483338-34620-242209012801739 `" && echo ansible-tmp-1726853751.6483338-34620-242209012801739="` echo 
/root/.ansible/tmp/ansible-tmp-1726853751.6483338-34620-242209012801739 `" ) && sleep 0' 30583 1726853751.65460: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853751.65466: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853751.65527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853751.65534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853751.65588: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853751.65630: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853751.65653: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853751.65768: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853751.67757: stdout chunk (state=3): >>>ansible-tmp-1726853751.6483338-34620-242209012801739=/root/.ansible/tmp/ansible-tmp-1726853751.6483338-34620-242209012801739 <<< 30583 1726853751.67912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853751.68176: stdout chunk (state=3): >>><<< 30583 
1726853751.68179: stderr chunk (state=3): >>><<< 30583 1726853751.68182: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853751.6483338-34620-242209012801739=/root/.ansible/tmp/ansible-tmp-1726853751.6483338-34620-242209012801739 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853751.68185: variable 'ansible_module_compression' from source: unknown 30583 1726853751.68187: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30583 1726853751.68190: variable 'ansible_facts' from source: unknown 30583 1726853751.68514: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853751.6483338-34620-242209012801739/AnsiballZ_systemd.py 30583 1726853751.68695: Sending initial data 30583 1726853751.68698: Sent initial data (156 bytes) 30583 1726853751.69256: 
stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853751.69265: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853751.69278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853751.69292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853751.69304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853751.69386: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853751.69396: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853751.69408: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853751.69416: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853751.69518: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853751.71318: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 
1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853751.71390: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853751.71505: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp5u6ak0z_ /root/.ansible/tmp/ansible-tmp-1726853751.6483338-34620-242209012801739/AnsiballZ_systemd.py <<< 30583 1726853751.71509: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853751.6483338-34620-242209012801739/AnsiballZ_systemd.py" <<< 30583 1726853751.71573: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp5u6ak0z_" to remote "/root/.ansible/tmp/ansible-tmp-1726853751.6483338-34620-242209012801739/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853751.6483338-34620-242209012801739/AnsiballZ_systemd.py" <<< 30583 1726853751.73840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853751.74034: stderr chunk (state=3): >>><<< 30583 1726853751.74037: stdout chunk (state=3): >>><<< 30583 1726853751.74068: done transferring module to remote 30583 1726853751.74356: _low_level_execute_command(): starting 30583 1726853751.74362: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853751.6483338-34620-242209012801739/ /root/.ansible/tmp/ansible-tmp-1726853751.6483338-34620-242209012801739/AnsiballZ_systemd.py && sleep 0' 30583 
1726853751.75320: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853751.75347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853751.75629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853751.75640: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853751.75701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853751.77597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853751.77629: stderr chunk (state=3): >>><<< 30583 1726853751.77638: stdout chunk (state=3): >>><<< 30583 1726853751.77662: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853751.77674: _low_level_execute_command(): starting 30583 1726853751.77688: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853751.6483338-34620-242209012801739/AnsiballZ_systemd.py && sleep 0' 30583 1726853751.79093: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853751.79236: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853751.79304: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853751.79494: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853751.79634: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853751.79778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853752.09497: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ 
path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4657152", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3304665088", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1943516000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": 
"0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredum<<< 30583 1726853752.09529: stdout chunk (state=3): >>>pReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": 
"no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "syst<<< 30583 1726853752.09555: stdout chunk (state=3): >>>em.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 
EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30583 1726853752.11628: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853752.11631: stdout chunk (state=3): >>><<< 30583 1726853752.11633: stderr chunk (state=3): >>><<< 30583 1726853752.11690: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4657152", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3304665088", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1943516000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", 
"ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853752.11880: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853751.6483338-34620-242209012801739/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853752.11907: _low_level_execute_command(): starting 30583 1726853752.11917: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853751.6483338-34620-242209012801739/ > /dev/null 2>&1 && sleep 0' 30583 1726853752.12573: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853752.12577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853752.12580: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853752.12592: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853752.12632: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30583 1726853752.12636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853752.12699: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853752.12747: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853752.12818: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853752.15275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853752.15280: stdout chunk (state=3): >>><<< 30583 1726853752.15282: stderr chunk (state=3): >>><<< 30583 1726853752.15285: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853752.15287: handler run complete 30583 1726853752.15290: attempt loop complete, returning result 30583 1726853752.15292: _execute() done 30583 1726853752.15294: dumping result to json 30583 1726853752.15296: done dumping result, returning 30583 1726853752.15298: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-05ea-abc5-0000000019cb] 30583 1726853752.15300: sending task result for task 02083763-bbaf-05ea-abc5-0000000019cb ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853752.15710: no more pending results, returning what we have 30583 1726853752.15713: results queue empty 30583 1726853752.15714: checking for any_errors_fatal 30583 1726853752.15721: done checking for any_errors_fatal 30583 1726853752.15721: checking for max_fail_percentage 30583 1726853752.15723: done checking for max_fail_percentage 30583 1726853752.15724: checking to see if all hosts have failed and the running result is not ok 30583 1726853752.15725: done checking to see if all hosts have failed 30583 1726853752.15726: getting the remaining hosts for this loop 30583 1726853752.15728: done getting the remaining hosts for this loop 30583 1726853752.15732: getting the next task for host managed_node2 30583 1726853752.15740: done getting next task for host managed_node2 30583 1726853752.15744: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30583 1726853752.15749: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853752.15766: getting variables 30583 1726853752.15768: in VariableManager get_vars() 30583 1726853752.16044: Calling all_inventory to load vars for managed_node2 30583 1726853752.16047: Calling groups_inventory to load vars for managed_node2 30583 1726853752.16050: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853752.16283: Calling all_plugins_play to load vars for managed_node2 30583 1726853752.16287: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853752.16290: Calling groups_plugins_play to load vars for managed_node2 30583 1726853752.16917: done sending task result for task 02083763-bbaf-05ea-abc5-0000000019cb 30583 1726853752.16921: WORKER PROCESS EXITING 30583 1726853752.19496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853752.22769: done with get_vars() 30583 1726853752.22872: done getting variables 30583 1726853752.23051: Loading ActionModule 'service' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:35:52 -0400 (0:00:00.736) 0:01:27.568 ****** 30583 1726853752.23094: entering _queue_task() for managed_node2/service 30583 1726853752.23703: worker is 1 (out of 1 available) 30583 1726853752.23831: exiting _queue_task() for managed_node2/service 30583 1726853752.23843: done queuing things up, now waiting for results queue to drain 30583 1726853752.23845: waiting for pending results... 30583 1726853752.24072: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30583 1726853752.24224: in run() - task 02083763-bbaf-05ea-abc5-0000000019cc 30583 1726853752.24237: variable 'ansible_search_path' from source: unknown 30583 1726853752.24241: variable 'ansible_search_path' from source: unknown 30583 1726853752.24290: calling self._execute() 30583 1726853752.24700: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853752.24703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853752.24706: variable 'omit' from source: magic vars 30583 1726853752.24824: variable 'ansible_distribution_major_version' from source: facts 30583 1726853752.24836: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853752.24964: variable 'network_provider' from source: set_fact 30583 1726853752.24967: Evaluated conditional (network_provider == "nm"): True 30583 1726853752.25061: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 
1726853752.25160: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853752.25325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853752.39223: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853752.39314: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853752.39402: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853752.39435: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853752.39486: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853752.39639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853752.39668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853752.39825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853752.39867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853752.39882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30583 1726853752.40000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853752.40144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853752.40168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853752.40206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853752.40219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853752.40374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853752.40400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853752.40422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853752.40494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853752.40510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853752.40742: variable 'network_connections' from source: include params 30583 1726853752.40753: variable 'interface' from source: play vars 30583 1726853752.40851: variable 'interface' from source: play vars 30583 1726853752.40961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853752.41375: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853752.41379: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853752.41381: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853752.41383: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853752.41399: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853752.41420: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853752.41599: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853752.41623: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853752.41660: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853752.42233: variable 'network_connections' from source: include params 30583 1726853752.42237: variable 'interface' from source: play vars 30583 1726853752.42386: variable 'interface' from source: play vars 30583 1726853752.42417: Evaluated conditional (__network_wpa_supplicant_required): False 30583 1726853752.42466: when evaluation is False, skipping this task 30583 1726853752.42469: _execute() done 30583 1726853752.42474: dumping result to json 30583 1726853752.42476: done dumping result, returning 30583 1726853752.42485: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-05ea-abc5-0000000019cc] 30583 1726853752.42496: sending task result for task 02083763-bbaf-05ea-abc5-0000000019cc 30583 1726853752.42600: done sending task result for task 02083763-bbaf-05ea-abc5-0000000019cc 30583 1726853752.42604: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30583 1726853752.42649: no more pending results, returning what we have 30583 1726853752.42652: results queue empty 30583 1726853752.42653: checking for any_errors_fatal 30583 1726853752.42672: done checking for any_errors_fatal 30583 1726853752.42673: checking for max_fail_percentage 30583 1726853752.42676: done checking for max_fail_percentage 30583 1726853752.42677: checking to see if all hosts have failed and the running result is not ok 30583 1726853752.42677: done checking to see if all hosts have failed 30583 1726853752.42678: getting the remaining hosts for this loop 30583 1726853752.42680: done getting the remaining hosts for this loop 30583 1726853752.42684: getting the next task 
for host managed_node2 30583 1726853752.42692: done getting next task for host managed_node2 30583 1726853752.42696: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30583 1726853752.42700: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853752.42730: getting variables 30583 1726853752.42732: in VariableManager get_vars() 30583 1726853752.42889: Calling all_inventory to load vars for managed_node2 30583 1726853752.42892: Calling groups_inventory to load vars for managed_node2 30583 1726853752.42895: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853752.42905: Calling all_plugins_play to load vars for managed_node2 30583 1726853752.42908: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853752.42910: Calling groups_plugins_play to load vars for managed_node2 30583 1726853752.51815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853752.53479: done with get_vars() 30583 1726853752.53508: done getting variables 30583 1726853752.53561: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:35:52 -0400 (0:00:00.304) 0:01:27.873 ****** 30583 1726853752.53596: entering _queue_task() for managed_node2/service 30583 1726853752.53976: worker is 1 (out of 1 available) 30583 1726853752.53990: exiting _queue_task() for managed_node2/service 30583 1726853752.54004: done queuing things up, now waiting for results queue to drain 30583 1726853752.54006: waiting for pending results... 
30583 1726853752.54592: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30583 1726853752.54714: in run() - task 02083763-bbaf-05ea-abc5-0000000019cd 30583 1726853752.54733: variable 'ansible_search_path' from source: unknown 30583 1726853752.54738: variable 'ansible_search_path' from source: unknown 30583 1726853752.54835: calling self._execute() 30583 1726853752.54880: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853752.54885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853752.54896: variable 'omit' from source: magic vars 30583 1726853752.55345: variable 'ansible_distribution_major_version' from source: facts 30583 1726853752.55357: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853752.55690: variable 'network_provider' from source: set_fact 30583 1726853752.55875: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853752.55880: when evaluation is False, skipping this task 30583 1726853752.55882: _execute() done 30583 1726853752.55885: dumping result to json 30583 1726853752.55888: done dumping result, returning 30583 1726853752.55890: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-05ea-abc5-0000000019cd] 30583 1726853752.55893: sending task result for task 02083763-bbaf-05ea-abc5-0000000019cd 30583 1726853752.55969: done sending task result for task 02083763-bbaf-05ea-abc5-0000000019cd 30583 1726853752.55974: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853752.56021: no more pending results, returning what we have 30583 1726853752.56025: results queue empty 30583 1726853752.56026: checking for any_errors_fatal 30583 1726853752.56039: done checking for 
any_errors_fatal 30583 1726853752.56040: checking for max_fail_percentage 30583 1726853752.56042: done checking for max_fail_percentage 30583 1726853752.56043: checking to see if all hosts have failed and the running result is not ok 30583 1726853752.56044: done checking to see if all hosts have failed 30583 1726853752.56045: getting the remaining hosts for this loop 30583 1726853752.56047: done getting the remaining hosts for this loop 30583 1726853752.56051: getting the next task for host managed_node2 30583 1726853752.56064: done getting next task for host managed_node2 30583 1726853752.56070: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853752.56079: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853752.56111: getting variables 30583 1726853752.56113: in VariableManager get_vars() 30583 1726853752.56163: Calling all_inventory to load vars for managed_node2 30583 1726853752.56166: Calling groups_inventory to load vars for managed_node2 30583 1726853752.56168: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853752.56382: Calling all_plugins_play to load vars for managed_node2 30583 1726853752.56385: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853752.56388: Calling groups_plugins_play to load vars for managed_node2 30583 1726853752.57687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853752.59281: done with get_vars() 30583 1726853752.59304: done getting variables 30583 1726853752.59361: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:35:52 -0400 (0:00:00.057) 0:01:27.931 ****** 30583 1726853752.59398: entering _queue_task() for managed_node2/copy 30583 1726853752.59747: worker is 1 (out of 1 available) 30583 1726853752.59763: exiting _queue_task() for managed_node2/copy 30583 1726853752.59979: done queuing things up, now waiting for results queue to drain 30583 1726853752.59981: waiting for pending results... 
30583 1726853752.60095: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853752.60278: in run() - task 02083763-bbaf-05ea-abc5-0000000019ce 30583 1726853752.60322: variable 'ansible_search_path' from source: unknown 30583 1726853752.60326: variable 'ansible_search_path' from source: unknown 30583 1726853752.60350: calling self._execute() 30583 1726853752.60467: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853752.60541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853752.60548: variable 'omit' from source: magic vars 30583 1726853752.60923: variable 'ansible_distribution_major_version' from source: facts 30583 1726853752.60939: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853752.61073: variable 'network_provider' from source: set_fact 30583 1726853752.61089: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853752.61096: when evaluation is False, skipping this task 30583 1726853752.61104: _execute() done 30583 1726853752.61112: dumping result to json 30583 1726853752.61119: done dumping result, returning 30583 1726853752.61132: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-05ea-abc5-0000000019ce] 30583 1726853752.61140: sending task result for task 02083763-bbaf-05ea-abc5-0000000019ce 30583 1726853752.61269: done sending task result for task 02083763-bbaf-05ea-abc5-0000000019ce 30583 1726853752.61274: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30583 1726853752.61343: no more pending results, returning what we have 30583 1726853752.61348: results queue empty 30583 1726853752.61349: checking for 
any_errors_fatal 30583 1726853752.61356: done checking for any_errors_fatal 30583 1726853752.61357: checking for max_fail_percentage 30583 1726853752.61362: done checking for max_fail_percentage 30583 1726853752.61363: checking to see if all hosts have failed and the running result is not ok 30583 1726853752.61364: done checking to see if all hosts have failed 30583 1726853752.61364: getting the remaining hosts for this loop 30583 1726853752.61366: done getting the remaining hosts for this loop 30583 1726853752.61370: getting the next task for host managed_node2 30583 1726853752.61382: done getting next task for host managed_node2 30583 1726853752.61387: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853752.61393: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853752.61421: getting variables 30583 1726853752.61423: in VariableManager get_vars() 30583 1726853752.61575: Calling all_inventory to load vars for managed_node2 30583 1726853752.61578: Calling groups_inventory to load vars for managed_node2 30583 1726853752.61581: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853752.61593: Calling all_plugins_play to load vars for managed_node2 30583 1726853752.61597: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853752.61605: Calling groups_plugins_play to load vars for managed_node2 30583 1726853752.63324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853752.65751: done with get_vars() 30583 1726853752.65783: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:35:52 -0400 (0:00:00.064) 0:01:27.996 ****** 30583 1726853752.65879: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853752.66237: worker is 1 (out of 1 available) 30583 1726853752.66250: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853752.66269: done queuing things up, now waiting for results queue to drain 30583 1726853752.66273: waiting for pending results... 
30583 1726853752.66497: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853752.66631: in run() - task 02083763-bbaf-05ea-abc5-0000000019cf 30583 1726853752.66642: variable 'ansible_search_path' from source: unknown 30583 1726853752.66646: variable 'ansible_search_path' from source: unknown 30583 1726853752.66680: calling self._execute() 30583 1726853752.66961: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853752.66964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853752.66966: variable 'omit' from source: magic vars 30583 1726853752.67878: variable 'ansible_distribution_major_version' from source: facts 30583 1726853752.67882: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853752.67885: variable 'omit' from source: magic vars 30583 1726853752.67887: variable 'omit' from source: magic vars 30583 1726853752.68117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853752.70934: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853752.71020: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853752.71062: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853752.71105: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853752.71139: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853752.71229: variable 'network_provider' from source: set_fact 30583 1726853752.71368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853752.71404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853752.71433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853752.71484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853752.71503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853752.71584: variable 'omit' from source: magic vars 30583 1726853752.71698: variable 'omit' from source: magic vars 30583 1726853752.71805: variable 'network_connections' from source: include params 30583 1726853752.71820: variable 'interface' from source: play vars 30583 1726853752.71886: variable 'interface' from source: play vars 30583 1726853752.72034: variable 'omit' from source: magic vars 30583 1726853752.72047: variable '__lsr_ansible_managed' from source: task vars 30583 1726853752.72114: variable '__lsr_ansible_managed' from source: task vars 30583 1726853752.72311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30583 1726853752.72515: Loaded config def from plugin (lookup/template) 30583 1726853752.72529: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30583 1726853752.72559: File lookup term: get_ansible_managed.j2 30583 1726853752.72567: variable 
'ansible_search_path' from source: unknown 30583 1726853752.72638: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30583 1726853752.72643: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30583 1726853752.72646: variable 'ansible_search_path' from source: unknown 30583 1726853752.81408: variable 'ansible_managed' from source: unknown 30583 1726853752.81604: variable 'omit' from source: magic vars 30583 1726853752.81608: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853752.81628: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853752.81652: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853752.81676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30583 1726853752.81708: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853752.81743: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853752.81775: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853752.81779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853752.81855: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853752.81867: Set connection var ansible_timeout to 10 30583 1726853752.81875: Set connection var ansible_connection to ssh 30583 1726853752.81906: Set connection var ansible_shell_executable to /bin/sh 30583 1726853752.81910: Set connection var ansible_shell_type to sh 30583 1726853752.81932: Set connection var ansible_pipelining to False 30583 1726853752.82042: variable 'ansible_shell_executable' from source: unknown 30583 1726853752.82045: variable 'ansible_connection' from source: unknown 30583 1726853752.82047: variable 'ansible_module_compression' from source: unknown 30583 1726853752.82050: variable 'ansible_shell_type' from source: unknown 30583 1726853752.82052: variable 'ansible_shell_executable' from source: unknown 30583 1726853752.82054: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853752.82056: variable 'ansible_pipelining' from source: unknown 30583 1726853752.82059: variable 'ansible_timeout' from source: unknown 30583 1726853752.82061: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853752.82127: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853752.82230: variable 'omit' from 
source: magic vars 30583 1726853752.82243: starting attempt loop 30583 1726853752.82249: running the handler 30583 1726853752.82273: _low_level_execute_command(): starting 30583 1726853752.82285: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853752.83063: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853752.83081: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853752.83242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853752.83443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853752.83464: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853752.83506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853752.83717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853752.85448: stdout chunk (state=3): >>>/root <<< 30583 1726853752.85688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853752.85691: stdout chunk (state=3): >>><<< 30583 
1726853752.85694: stderr chunk (state=3): >>><<< 30583 1726853752.85697: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853752.85702: _low_level_execute_command(): starting 30583 1726853752.85714: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853752.856699-34710-76058237199308 `" && echo ansible-tmp-1726853752.856699-34710-76058237199308="` echo /root/.ansible/tmp/ansible-tmp-1726853752.856699-34710-76058237199308 `" ) && sleep 0' 30583 1726853752.86716: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853752.86820: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853752.86849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853752.86899: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853752.86924: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853752.87073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853752.89098: stdout chunk (state=3): >>>ansible-tmp-1726853752.856699-34710-76058237199308=/root/.ansible/tmp/ansible-tmp-1726853752.856699-34710-76058237199308 <<< 30583 1726853752.89198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853752.89255: stderr chunk (state=3): >>><<< 30583 1726853752.89281: stdout chunk (state=3): >>><<< 30583 1726853752.89307: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853752.856699-34710-76058237199308=/root/.ansible/tmp/ansible-tmp-1726853752.856699-34710-76058237199308 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853752.89420: variable 'ansible_module_compression' from source: unknown 30583 1726853752.89433: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30583 1726853752.89469: variable 'ansible_facts' from source: unknown 30583 1726853752.89595: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853752.856699-34710-76058237199308/AnsiballZ_network_connections.py 30583 1726853752.89768: Sending initial data 30583 1726853752.89775: Sent initial data (166 bytes) 30583 1726853752.90419: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853752.90482: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853752.90499: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853752.90554: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853752.90575: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853752.90598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853752.90706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853752.92376: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853752.92473: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH 
"." <<< 30583 1726853752.92545: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp7nyaplee /root/.ansible/tmp/ansible-tmp-1726853752.856699-34710-76058237199308/AnsiballZ_network_connections.py <<< 30583 1726853752.92549: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853752.856699-34710-76058237199308/AnsiballZ_network_connections.py" <<< 30583 1726853752.92624: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp7nyaplee" to remote "/root/.ansible/tmp/ansible-tmp-1726853752.856699-34710-76058237199308/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853752.856699-34710-76058237199308/AnsiballZ_network_connections.py" <<< 30583 1726853752.93953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853752.93998: stderr chunk (state=3): >>><<< 30583 1726853752.94012: stdout chunk (state=3): >>><<< 30583 1726853752.94268: done transferring module to remote 30583 1726853752.94277: _low_level_execute_command(): starting 30583 1726853752.94280: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853752.856699-34710-76058237199308/ /root/.ansible/tmp/ansible-tmp-1726853752.856699-34710-76058237199308/AnsiballZ_network_connections.py && sleep 0' 30583 1726853752.95840: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853752.96396: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853752.96491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853752.98438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853752.98442: stdout chunk (state=3): >>><<< 30583 1726853752.98450: stderr chunk (state=3): >>><<< 30583 1726853752.98469: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853752.98476: _low_level_execute_command(): starting 30583 1726853752.98478: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853752.856699-34710-76058237199308/AnsiballZ_network_connections.py && sleep 0' 30583 1726853752.99283: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853752.99290: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853752.99293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853752.99295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853752.99298: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853752.99300: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853752.99302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853752.99304: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853752.99306: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853752.99308: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853752.99310: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853752.99312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853752.99313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853752.99315: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853752.99391: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853752.99595: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853752.99860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853753.26070: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 5c3c483d-e950-47f9-9afb-d5e74f691954 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30583 1726853753.28185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853753.28189: stdout chunk (state=3): >>><<< 30583 1726853753.28191: stderr chunk (state=3): >>><<< 30583 1726853753.28194: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 5c3c483d-e950-47f9-9afb-d5e74f691954 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853753.28232: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853752.856699-34710-76058237199308/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853753.28242: _low_level_execute_command(): starting 30583 1726853753.28245: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853752.856699-34710-76058237199308/ > /dev/null 2>&1 && sleep 0' 30583 1726853753.29507: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853753.29510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853753.29513: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853753.29516: stderr chunk 
(state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853753.29684: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853753.29687: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853753.29702: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853753.29846: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853753.31761: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853753.31789: stderr chunk (state=3): >>><<< 30583 1726853753.31795: stdout chunk (state=3): >>><<< 30583 1726853753.31892: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853753.31898: handler run complete 30583 1726853753.31926: attempt loop complete, returning result 30583 1726853753.31929: _execute() done 30583 1726853753.31931: dumping result to json 30583 1726853753.31934: done dumping result, returning 30583 1726853753.31976: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-05ea-abc5-0000000019cf] 30583 1726853753.31979: sending task result for task 02083763-bbaf-05ea-abc5-0000000019cf ok: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 5c3c483d-e950-47f9-9afb-d5e74f691954 skipped because already active 30583 1726853753.32297: no more pending results, returning what we have 30583 1726853753.32300: results queue empty 30583 1726853753.32301: checking for any_errors_fatal 30583 1726853753.32310: done checking for any_errors_fatal 30583 1726853753.32310: checking for max_fail_percentage 30583 1726853753.32312: done checking for max_fail_percentage 30583 1726853753.32313: checking to see if all hosts have failed and the running result is not ok 30583 1726853753.32314: done checking to see if all hosts have failed 30583 1726853753.32315: getting the remaining hosts for this loop 30583 1726853753.32316: done getting the remaining hosts for this loop 30583 1726853753.32319: getting the next task for host 
managed_node2 30583 1726853753.32327: done getting next task for host managed_node2 30583 1726853753.32330: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853753.32334: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853753.32345: getting variables 30583 1726853753.32347: in VariableManager get_vars() 30583 1726853753.32496: Calling all_inventory to load vars for managed_node2 30583 1726853753.32500: Calling groups_inventory to load vars for managed_node2 30583 1726853753.32502: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853753.32592: Calling all_plugins_play to load vars for managed_node2 30583 1726853753.32596: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853753.32599: Calling groups_plugins_play to load vars for managed_node2 30583 1726853753.33203: done sending task result for task 02083763-bbaf-05ea-abc5-0000000019cf 30583 1726853753.33207: WORKER PROCESS EXITING 30583 1726853753.35385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853753.39236: done with get_vars() 30583 1726853753.39263: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:35:53 -0400 (0:00:00.735) 0:01:28.731 ****** 30583 1726853753.39470: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30583 1726853753.40196: worker is 1 (out of 1 available) 30583 1726853753.40209: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30583 1726853753.40274: done queuing things up, now waiting for results queue to drain 30583 1726853753.40277: waiting for pending results... 
30583 1726853753.40769: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853753.41017: in run() - task 02083763-bbaf-05ea-abc5-0000000019d0 30583 1726853753.41032: variable 'ansible_search_path' from source: unknown 30583 1726853753.41037: variable 'ansible_search_path' from source: unknown 30583 1726853753.41078: calling self._execute() 30583 1726853753.41272: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853753.41484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853753.41493: variable 'omit' from source: magic vars 30583 1726853753.42076: variable 'ansible_distribution_major_version' from source: facts 30583 1726853753.42195: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853753.42313: variable 'network_state' from source: role '' defaults 30583 1726853753.42324: Evaluated conditional (network_state != {}): False 30583 1726853753.42328: when evaluation is False, skipping this task 30583 1726853753.42331: _execute() done 30583 1726853753.42333: dumping result to json 30583 1726853753.42336: done dumping result, returning 30583 1726853753.42414: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-05ea-abc5-0000000019d0] 30583 1726853753.42417: sending task result for task 02083763-bbaf-05ea-abc5-0000000019d0 30583 1726853753.42774: done sending task result for task 02083763-bbaf-05ea-abc5-0000000019d0 30583 1726853753.42777: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853753.42827: no more pending results, returning what we have 30583 1726853753.42831: results queue empty 30583 1726853753.42832: checking for any_errors_fatal 30583 1726853753.43075: done checking for any_errors_fatal 
30583 1726853753.43076: checking for max_fail_percentage 30583 1726853753.43078: done checking for max_fail_percentage 30583 1726853753.43079: checking to see if all hosts have failed and the running result is not ok 30583 1726853753.43080: done checking to see if all hosts have failed 30583 1726853753.43080: getting the remaining hosts for this loop 30583 1726853753.43082: done getting the remaining hosts for this loop 30583 1726853753.43085: getting the next task for host managed_node2 30583 1726853753.43092: done getting next task for host managed_node2 30583 1726853753.43095: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30583 1726853753.43102: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853753.43122: getting variables 30583 1726853753.43124: in VariableManager get_vars() 30583 1726853753.43163: Calling all_inventory to load vars for managed_node2 30583 1726853753.43166: Calling groups_inventory to load vars for managed_node2 30583 1726853753.43168: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853753.43195: Calling all_plugins_play to load vars for managed_node2 30583 1726853753.43198: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853753.43202: Calling groups_plugins_play to load vars for managed_node2 30583 1726853753.45386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853753.47262: done with get_vars() 30583 1726853753.47296: done getting variables 30583 1726853753.47370: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:35:53 -0400 (0:00:00.079) 0:01:28.811 ****** 30583 1726853753.47413: entering _queue_task() for managed_node2/debug 30583 1726853753.47813: worker is 1 (out of 1 available) 30583 1726853753.47826: exiting _queue_task() for managed_node2/debug 30583 1726853753.47839: done queuing things up, now waiting for results queue to drain 30583 1726853753.47840: waiting for pending results... 
30583 1726853753.48165: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30583 1726853753.48409: in run() - task 02083763-bbaf-05ea-abc5-0000000019d1 30583 1726853753.48412: variable 'ansible_search_path' from source: unknown 30583 1726853753.48415: variable 'ansible_search_path' from source: unknown 30583 1726853753.48455: calling self._execute() 30583 1726853753.48627: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853753.48632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853753.48635: variable 'omit' from source: magic vars 30583 1726853753.49029: variable 'ansible_distribution_major_version' from source: facts 30583 1726853753.49048: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853753.49074: variable 'omit' from source: magic vars 30583 1726853753.49142: variable 'omit' from source: magic vars 30583 1726853753.49197: variable 'omit' from source: magic vars 30583 1726853753.49281: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853753.49306: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853753.49332: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853753.49354: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853753.49390: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853753.49416: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853753.49498: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853753.49501: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 30583 1726853753.49550: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853753.49565: Set connection var ansible_timeout to 10 30583 1726853753.49581: Set connection var ansible_connection to ssh 30583 1726853753.49592: Set connection var ansible_shell_executable to /bin/sh 30583 1726853753.49609: Set connection var ansible_shell_type to sh 30583 1726853753.49625: Set connection var ansible_pipelining to False 30583 1726853753.49652: variable 'ansible_shell_executable' from source: unknown 30583 1726853753.49663: variable 'ansible_connection' from source: unknown 30583 1726853753.49674: variable 'ansible_module_compression' from source: unknown 30583 1726853753.49681: variable 'ansible_shell_type' from source: unknown 30583 1726853753.49689: variable 'ansible_shell_executable' from source: unknown 30583 1726853753.49695: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853753.49703: variable 'ansible_pipelining' from source: unknown 30583 1726853753.49825: variable 'ansible_timeout' from source: unknown 30583 1726853753.49828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853753.49887: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853753.49907: variable 'omit' from source: magic vars 30583 1726853753.49917: starting attempt loop 30583 1726853753.49932: running the handler 30583 1726853753.50085: variable '__network_connections_result' from source: set_fact 30583 1726853753.50174: handler run complete 30583 1726853753.50197: attempt loop complete, returning result 30583 1726853753.50203: _execute() done 30583 1726853753.50209: dumping result to json 30583 1726853753.50214: 
done dumping result, returning 30583 1726853753.50268: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-05ea-abc5-0000000019d1] 30583 1726853753.50274: sending task result for task 02083763-bbaf-05ea-abc5-0000000019d1 ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 5c3c483d-e950-47f9-9afb-d5e74f691954 skipped because already active" ] } 30583 1726853753.50582: no more pending results, returning what we have 30583 1726853753.50587: results queue empty 30583 1726853753.50588: checking for any_errors_fatal 30583 1726853753.50600: done checking for any_errors_fatal 30583 1726853753.50601: checking for max_fail_percentage 30583 1726853753.50603: done checking for max_fail_percentage 30583 1726853753.50604: checking to see if all hosts have failed and the running result is not ok 30583 1726853753.50605: done checking to see if all hosts have failed 30583 1726853753.50606: getting the remaining hosts for this loop 30583 1726853753.50608: done getting the remaining hosts for this loop 30583 1726853753.50612: getting the next task for host managed_node2 30583 1726853753.50621: done getting next task for host managed_node2 30583 1726853753.50626: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30583 1726853753.50636: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853753.50652: getting variables 30583 1726853753.50654: in VariableManager get_vars() 30583 1726853753.50813: Calling all_inventory to load vars for managed_node2 30583 1726853753.50816: Calling groups_inventory to load vars for managed_node2 30583 1726853753.50818: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853753.50899: Calling all_plugins_play to load vars for managed_node2 30583 1726853753.50903: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853753.50913: done sending task result for task 02083763-bbaf-05ea-abc5-0000000019d1 30583 1726853753.50916: WORKER PROCESS EXITING 30583 1726853753.50921: Calling groups_plugins_play to load vars for managed_node2 30583 1726853753.52760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853753.54386: done with get_vars() 30583 1726853753.54412: done getting variables 30583 1726853753.54469: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:35:53 -0400 (0:00:00.071) 0:01:28.882 ****** 30583 1726853753.54517: entering _queue_task() for managed_node2/debug 30583 1726853753.54891: worker is 1 (out of 1 available) 30583 1726853753.54903: exiting _queue_task() for managed_node2/debug 30583 1726853753.55028: done queuing things up, now waiting for results queue to drain 30583 1726853753.55030: waiting for pending results... 30583 1726853753.55249: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30583 1726853753.55573: in run() - task 02083763-bbaf-05ea-abc5-0000000019d2 30583 1726853753.55577: variable 'ansible_search_path' from source: unknown 30583 1726853753.55580: variable 'ansible_search_path' from source: unknown 30583 1726853753.55621: calling self._execute() 30583 1726853753.55978: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853753.55982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853753.55984: variable 'omit' from source: magic vars 30583 1726853753.56649: variable 'ansible_distribution_major_version' from source: facts 30583 1726853753.56712: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853753.56976: variable 'omit' from source: magic vars 30583 1726853753.56980: variable 'omit' from source: magic vars 30583 1726853753.56983: variable 'omit' from source: magic vars 30583 1726853753.57064: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853753.57176: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853753.57268: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853753.57293: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853753.57309: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853753.57388: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853753.57574: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853753.57578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853753.57697: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853753.57708: Set connection var ansible_timeout to 10 30583 1726853753.57715: Set connection var ansible_connection to ssh 30583 1726853753.57724: Set connection var ansible_shell_executable to /bin/sh 30583 1726853753.57729: Set connection var ansible_shell_type to sh 30583 1726853753.57743: Set connection var ansible_pipelining to False 30583 1726853753.57779: variable 'ansible_shell_executable' from source: unknown 30583 1726853753.57978: variable 'ansible_connection' from source: unknown 30583 1726853753.57983: variable 'ansible_module_compression' from source: unknown 30583 1726853753.57985: variable 'ansible_shell_type' from source: unknown 30583 1726853753.57987: variable 'ansible_shell_executable' from source: unknown 30583 1726853753.57989: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853753.57991: variable 'ansible_pipelining' from source: unknown 30583 1726853753.57993: variable 'ansible_timeout' from source: unknown 30583 1726853753.57995: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30583 1726853753.58184: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853753.58376: variable 'omit' from source: magic vars 30583 1726853753.58379: starting attempt loop 30583 1726853753.58382: running the handler 30583 1726853753.58425: variable '__network_connections_result' from source: set_fact 30583 1726853753.58661: variable '__network_connections_result' from source: set_fact 30583 1726853753.58882: handler run complete 30583 1726853753.58914: attempt loop complete, returning result 30583 1726853753.58995: _execute() done 30583 1726853753.59002: dumping result to json 30583 1726853753.59010: done dumping result, returning 30583 1726853753.59023: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-05ea-abc5-0000000019d2] 30583 1726853753.59031: sending task result for task 02083763-bbaf-05ea-abc5-0000000019d2 ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 5c3c483d-e950-47f9-9afb-d5e74f691954 skipped because already active\n", "stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 5c3c483d-e950-47f9-9afb-d5e74f691954 skipped because already active" ] } } 30583 1726853753.59312: no more pending results, returning what we have 30583 1726853753.59316: results queue 
empty 30583 1726853753.59317: checking for any_errors_fatal 30583 1726853753.59324: done checking for any_errors_fatal 30583 1726853753.59325: checking for max_fail_percentage 30583 1726853753.59327: done checking for max_fail_percentage 30583 1726853753.59328: checking to see if all hosts have failed and the running result is not ok 30583 1726853753.59329: done checking to see if all hosts have failed 30583 1726853753.59330: getting the remaining hosts for this loop 30583 1726853753.59332: done getting the remaining hosts for this loop 30583 1726853753.59335: getting the next task for host managed_node2 30583 1726853753.59344: done getting next task for host managed_node2 30583 1726853753.59348: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30583 1726853753.59354: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853753.59533: getting variables 30583 1726853753.59536: in VariableManager get_vars() 30583 1726853753.59586: Calling all_inventory to load vars for managed_node2 30583 1726853753.59589: Calling groups_inventory to load vars for managed_node2 30583 1726853753.59592: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853753.59761: Calling all_plugins_play to load vars for managed_node2 30583 1726853753.59773: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853753.59777: Calling groups_plugins_play to load vars for managed_node2 30583 1726853753.60579: done sending task result for task 02083763-bbaf-05ea-abc5-0000000019d2 30583 1726853753.60582: WORKER PROCESS EXITING 30583 1726853753.61851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853753.64342: done with get_vars() 30583 1726853753.64478: done getting variables 30583 1726853753.64549: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:35:53 -0400 (0:00:00.100) 0:01:28.983 ****** 30583 1726853753.64590: entering _queue_task() for managed_node2/debug 30583 1726853753.65603: worker is 1 (out of 1 available) 30583 1726853753.65616: exiting _queue_task() for managed_node2/debug 30583 1726853753.65627: done queuing things up, now waiting for results queue to drain 30583 1726853753.65629: waiting for pending results... 
30583 1726853753.66076: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30583 1726853753.66421: in run() - task 02083763-bbaf-05ea-abc5-0000000019d3 30583 1726853753.66493: variable 'ansible_search_path' from source: unknown 30583 1726853753.66501: variable 'ansible_search_path' from source: unknown 30583 1726853753.66542: calling self._execute() 30583 1726853753.66977: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853753.66981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853753.66983: variable 'omit' from source: magic vars 30583 1726853753.67782: variable 'ansible_distribution_major_version' from source: facts 30583 1726853753.67786: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853753.67968: variable 'network_state' from source: role '' defaults 30583 1726853753.68010: Evaluated conditional (network_state != {}): False 30583 1726853753.68083: when evaluation is False, skipping this task 30583 1726853753.68092: _execute() done 30583 1726853753.68109: dumping result to json 30583 1726853753.68118: done dumping result, returning 30583 1726853753.68132: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-05ea-abc5-0000000019d3] 30583 1726853753.68142: sending task result for task 02083763-bbaf-05ea-abc5-0000000019d3 skipping: [managed_node2] => { "false_condition": "network_state != {}" } 30583 1726853753.68373: no more pending results, returning what we have 30583 1726853753.68378: results queue empty 30583 1726853753.68379: checking for any_errors_fatal 30583 1726853753.68390: done checking for any_errors_fatal 30583 1726853753.68390: checking for max_fail_percentage 30583 1726853753.68393: done checking for max_fail_percentage 30583 1726853753.68394: checking to see if all hosts have 
failed and the running result is not ok 30583 1726853753.68395: done checking to see if all hosts have failed 30583 1726853753.68395: getting the remaining hosts for this loop 30583 1726853753.68397: done getting the remaining hosts for this loop 30583 1726853753.68401: getting the next task for host managed_node2 30583 1726853753.68410: done getting next task for host managed_node2 30583 1726853753.68415: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30583 1726853753.68422: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853753.68575: getting variables 30583 1726853753.68577: in VariableManager get_vars() 30583 1726853753.68622: Calling all_inventory to load vars for managed_node2 30583 1726853753.68625: Calling groups_inventory to load vars for managed_node2 30583 1726853753.68628: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853753.68874: Calling all_plugins_play to load vars for managed_node2 30583 1726853753.68879: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853753.68892: done sending task result for task 02083763-bbaf-05ea-abc5-0000000019d3 30583 1726853753.68895: WORKER PROCESS EXITING 30583 1726853753.68899: Calling groups_plugins_play to load vars for managed_node2 30583 1726853753.72552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853753.76064: done with get_vars() 30583 1726853753.76211: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:35:53 -0400 (0:00:00.118) 0:01:29.101 ****** 30583 1726853753.76438: entering _queue_task() for managed_node2/ping 30583 1726853753.77166: worker is 1 (out of 1 available) 30583 1726853753.77181: exiting _queue_task() for managed_node2/ping 30583 1726853753.77583: done queuing things up, now waiting for results queue to drain 30583 1726853753.77585: waiting for pending results... 
30583 1726853753.78004: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 30583 1726853753.78575: in run() - task 02083763-bbaf-05ea-abc5-0000000019d4 30583 1726853753.78597: variable 'ansible_search_path' from source: unknown 30583 1726853753.78608: variable 'ansible_search_path' from source: unknown 30583 1726853753.78689: calling self._execute() 30583 1726853753.79297: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853753.79300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853753.79303: variable 'omit' from source: magic vars 30583 1726853753.79970: variable 'ansible_distribution_major_version' from source: facts 30583 1726853753.79989: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853753.80073: variable 'omit' from source: magic vars 30583 1726853753.80138: variable 'omit' from source: magic vars 30583 1726853753.80206: variable 'omit' from source: magic vars 30583 1726853753.80322: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853753.80423: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853753.80448: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853753.80610: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853753.80713: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853753.80717: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853753.80719: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853753.80722: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30583 1726853753.80895: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853753.80907: Set connection var ansible_timeout to 10 30583 1726853753.80914: Set connection var ansible_connection to ssh 30583 1726853753.80923: Set connection var ansible_shell_executable to /bin/sh 30583 1726853753.80952: Set connection var ansible_shell_type to sh 30583 1726853753.80969: Set connection var ansible_pipelining to False 30583 1726853753.81168: variable 'ansible_shell_executable' from source: unknown 30583 1726853753.81173: variable 'ansible_connection' from source: unknown 30583 1726853753.81175: variable 'ansible_module_compression' from source: unknown 30583 1726853753.81177: variable 'ansible_shell_type' from source: unknown 30583 1726853753.81179: variable 'ansible_shell_executable' from source: unknown 30583 1726853753.81181: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853753.81183: variable 'ansible_pipelining' from source: unknown 30583 1726853753.81184: variable 'ansible_timeout' from source: unknown 30583 1726853753.81186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853753.81537: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853753.81554: variable 'omit' from source: magic vars 30583 1726853753.81566: starting attempt loop 30583 1726853753.81574: running the handler 30583 1726853753.81618: _low_level_execute_command(): starting 30583 1726853753.81629: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853753.83182: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853753.83255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 
1726853753.83277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853753.83387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853753.83493: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853753.83511: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853753.83593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853753.84085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853753.85854: stdout chunk (state=3): >>>/root <<< 30583 1726853753.85947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853753.86009: stderr chunk (state=3): >>><<< 30583 1726853753.86049: stdout chunk (state=3): >>><<< 30583 1726853753.86264: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 
10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853753.86268: _low_level_execute_command(): starting 30583 1726853753.86273: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853753.8617435-34768-198214357102365 `" && echo ansible-tmp-1726853753.8617435-34768-198214357102365="` echo /root/.ansible/tmp/ansible-tmp-1726853753.8617435-34768-198214357102365 `" ) && sleep 0' 30583 1726853753.87515: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853753.87532: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853753.87598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853753.87757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853753.87778: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853753.87900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853753.89982: stdout chunk (state=3): >>>ansible-tmp-1726853753.8617435-34768-198214357102365=/root/.ansible/tmp/ansible-tmp-1726853753.8617435-34768-198214357102365 <<< 30583 1726853753.90132: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853753.90216: stderr chunk (state=3): >>><<< 30583 1726853753.90232: stdout chunk (state=3): >>><<< 30583 1726853753.90448: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853753.8617435-34768-198214357102365=/root/.ansible/tmp/ansible-tmp-1726853753.8617435-34768-198214357102365 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853753.90452: variable 'ansible_module_compression' from source: unknown 30583 1726853753.90780: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30583 1726853753.90783: variable 'ansible_facts' from source: unknown 30583 1726853753.91018: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853753.8617435-34768-198214357102365/AnsiballZ_ping.py 30583 1726853753.91532: Sending initial data 30583 1726853753.91541: Sent initial data (153 bytes) 30583 1726853753.92719: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853753.92738: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853753.92822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853753.92892: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853753.93018: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853753.93105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853753.93188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853753.94933: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853753.95174: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853753.95577: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp_7d6regk /root/.ansible/tmp/ansible-tmp-1726853753.8617435-34768-198214357102365/AnsiballZ_ping.py <<< 30583 1726853753.95581: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853753.8617435-34768-198214357102365/AnsiballZ_ping.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp_7d6regk" to remote "/root/.ansible/tmp/ansible-tmp-1726853753.8617435-34768-198214357102365/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853753.8617435-34768-198214357102365/AnsiballZ_ping.py" <<< 30583 1726853753.97356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853753.97557: stderr chunk (state=3): >>><<< 30583 1726853753.97564: stdout chunk (state=3): >>><<< 30583 1726853753.97566: done transferring module to remote 30583 1726853753.97568: _low_level_execute_command(): starting 30583 1726853753.97572: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853753.8617435-34768-198214357102365/ /root/.ansible/tmp/ansible-tmp-1726853753.8617435-34768-198214357102365/AnsiballZ_ping.py && sleep 0' 30583 1726853753.98847: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853753.99015: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853753.99130: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853753.99149: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853753.99188: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853753.99610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853754.01764: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853754.01768: stdout chunk (state=3): >>><<< 30583 1726853754.01770: stderr chunk (state=3): >>><<< 30583 1726853754.01795: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853754.01968: _low_level_execute_command(): starting 30583 1726853754.01975: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853753.8617435-34768-198214357102365/AnsiballZ_ping.py && sleep 0' 30583 1726853754.03503: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853754.03551: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853754.03611: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853754.03688: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853754.03802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853754.19340: stdout 
chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30583 1726853754.20738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853754.20787: stderr chunk (state=3): >>>Shared connection to 10.31.9.197 closed. <<< 30583 1726853754.20877: stderr chunk (state=3): >>><<< 30583 1726853754.20886: stdout chunk (state=3): >>><<< 30583 1726853754.20907: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853754.20935: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853753.8617435-34768-198214357102365/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853754.21149: _low_level_execute_command(): starting 30583 1726853754.21152: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853753.8617435-34768-198214357102365/ > /dev/null 2>&1 && sleep 0' 30583 1726853754.22172: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853754.22286: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
30583 1726853754.22403: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853754.22468: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853754.22593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853754.22706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853754.24647: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853754.24657: stdout chunk (state=3): >>><<< 30583 1726853754.24673: stderr chunk (state=3): >>><<< 30583 1726853754.24692: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853754.24708: handler run complete 30583 1726853754.24727: attempt loop complete, returning result 
30583 1726853754.24763: _execute() done 30583 1726853754.24773: dumping result to json 30583 1726853754.24978: done dumping result, returning 30583 1726853754.24981: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-05ea-abc5-0000000019d4] 30583 1726853754.24983: sending task result for task 02083763-bbaf-05ea-abc5-0000000019d4 30583 1726853754.25052: done sending task result for task 02083763-bbaf-05ea-abc5-0000000019d4 30583 1726853754.25055: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 30583 1726853754.25150: no more pending results, returning what we have 30583 1726853754.25154: results queue empty 30583 1726853754.25155: checking for any_errors_fatal 30583 1726853754.25164: done checking for any_errors_fatal 30583 1726853754.25166: checking for max_fail_percentage 30583 1726853754.25168: done checking for max_fail_percentage 30583 1726853754.25169: checking to see if all hosts have failed and the running result is not ok 30583 1726853754.25169: done checking to see if all hosts have failed 30583 1726853754.25170: getting the remaining hosts for this loop 30583 1726853754.25174: done getting the remaining hosts for this loop 30583 1726853754.25178: getting the next task for host managed_node2 30583 1726853754.25188: done getting next task for host managed_node2 30583 1726853754.25191: ^ task is: TASK: meta (role_complete) 30583 1726853754.25197: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853754.25211: getting variables 30583 1726853754.25213: in VariableManager get_vars() 30583 1726853754.25263: Calling all_inventory to load vars for managed_node2 30583 1726853754.25267: Calling groups_inventory to load vars for managed_node2 30583 1726853754.25269: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853754.25484: Calling all_plugins_play to load vars for managed_node2 30583 1726853754.25602: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853754.25607: Calling groups_plugins_play to load vars for managed_node2 30583 1726853754.28740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853754.32417: done with get_vars() 30583 1726853754.32594: done getting variables 30583 1726853754.32680: done queuing things up, now waiting for results queue to drain 30583 1726853754.32682: results queue empty 30583 1726853754.32683: checking for any_errors_fatal 30583 1726853754.32686: done checking for any_errors_fatal 30583 1726853754.32687: checking for max_fail_percentage 30583 1726853754.32688: done checking for max_fail_percentage 30583 1726853754.32776: checking to see if all 
hosts have failed and the running result is not ok 30583 1726853754.32776: done checking to see if all hosts have failed 30583 1726853754.32777: getting the remaining hosts for this loop 30583 1726853754.32778: done getting the remaining hosts for this loop 30583 1726853754.32781: getting the next task for host managed_node2 30583 1726853754.32789: done getting next task for host managed_node2 30583 1726853754.32792: ^ task is: TASK: Include network role 30583 1726853754.32794: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853754.32797: getting variables 30583 1726853754.32975: in VariableManager get_vars() 30583 1726853754.32992: Calling all_inventory to load vars for managed_node2 30583 1726853754.32995: Calling groups_inventory to load vars for managed_node2 30583 1726853754.32998: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853754.33003: Calling all_plugins_play to load vars for managed_node2 30583 1726853754.33005: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853754.33008: Calling groups_plugins_play to load vars for managed_node2 30583 1726853754.35410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853754.39635: done with get_vars() 30583 1726853754.39669: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml:3 Friday 20 September 2024 13:35:54 -0400 (0:00:00.634) 0:01:29.735 ****** 30583 1726853754.39978: entering _queue_task() for managed_node2/include_role 30583 1726853754.40810: worker is 1 (out of 1 available) 30583 1726853754.40822: exiting _queue_task() for managed_node2/include_role 30583 1726853754.40835: done queuing things up, now waiting for results queue to drain 30583 1726853754.40837: waiting for pending results... 
30583 1726853754.41568: running TaskExecutor() for managed_node2/TASK: Include network role 30583 1726853754.42178: in run() - task 02083763-bbaf-05ea-abc5-0000000017d9 30583 1726853754.42182: variable 'ansible_search_path' from source: unknown 30583 1726853754.42185: variable 'ansible_search_path' from source: unknown 30583 1726853754.42190: calling self._execute() 30583 1726853754.42272: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853754.42514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853754.42560: variable 'omit' from source: magic vars 30583 1726853754.43132: variable 'ansible_distribution_major_version' from source: facts 30583 1726853754.43143: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853754.43274: _execute() done 30583 1726853754.43278: dumping result to json 30583 1726853754.43280: done dumping result, returning 30583 1726853754.43284: done running TaskExecutor() for managed_node2/TASK: Include network role [02083763-bbaf-05ea-abc5-0000000017d9] 30583 1726853754.43286: sending task result for task 02083763-bbaf-05ea-abc5-0000000017d9 30583 1726853754.43458: no more pending results, returning what we have 30583 1726853754.43464: in VariableManager get_vars() 30583 1726853754.43518: Calling all_inventory to load vars for managed_node2 30583 1726853754.43522: Calling groups_inventory to load vars for managed_node2 30583 1726853754.43526: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853754.43542: Calling all_plugins_play to load vars for managed_node2 30583 1726853754.43546: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853754.43550: Calling groups_plugins_play to load vars for managed_node2 30583 1726853754.44356: done sending task result for task 02083763-bbaf-05ea-abc5-0000000017d9 30583 1726853754.44359: WORKER PROCESS EXITING 30583 1726853754.46356: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853754.48554: done with get_vars() 30583 1726853754.48640: variable 'ansible_search_path' from source: unknown 30583 1726853754.48641: variable 'ansible_search_path' from source: unknown 30583 1726853754.48934: variable 'omit' from source: magic vars 30583 1726853754.49020: variable 'omit' from source: magic vars 30583 1726853754.49036: variable 'omit' from source: magic vars 30583 1726853754.49040: we have included files to process 30583 1726853754.49041: generating all_blocks data 30583 1726853754.49043: done generating all_blocks data 30583 1726853754.49049: processing included file: fedora.linux_system_roles.network 30583 1726853754.49068: in VariableManager get_vars() 30583 1726853754.49087: done with get_vars() 30583 1726853754.49130: in VariableManager get_vars() 30583 1726853754.49149: done with get_vars() 30583 1726853754.49197: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30583 1726853754.49327: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30583 1726853754.49414: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30583 1726853754.49925: in VariableManager get_vars() 30583 1726853754.49945: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30583 1726853754.53021: iterating over new_blocks loaded from include file 30583 1726853754.53023: in VariableManager get_vars() 30583 1726853754.53042: done with get_vars() 30583 1726853754.53044: filtering new block on tags 30583 1726853754.53348: done filtering new block on tags 30583 1726853754.53352: in VariableManager get_vars() 30583 1726853754.53366: done with get_vars() 30583 1726853754.53368: filtering new block on tags 30583 1726853754.53384: done 
filtering new block on tags 30583 1726853754.53386: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30583 1726853754.53391: extending task lists for all hosts with included blocks 30583 1726853754.53518: done extending task lists 30583 1726853754.53519: done processing included files 30583 1726853754.53521: results queue empty 30583 1726853754.53522: checking for any_errors_fatal 30583 1726853754.53524: done checking for any_errors_fatal 30583 1726853754.53524: checking for max_fail_percentage 30583 1726853754.53525: done checking for max_fail_percentage 30583 1726853754.53529: checking to see if all hosts have failed and the running result is not ok 30583 1726853754.53530: done checking to see if all hosts have failed 30583 1726853754.53531: getting the remaining hosts for this loop 30583 1726853754.53532: done getting the remaining hosts for this loop 30583 1726853754.53535: getting the next task for host managed_node2 30583 1726853754.53540: done getting next task for host managed_node2 30583 1726853754.53543: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30583 1726853754.53546: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853754.53563: getting variables 30583 1726853754.53564: in VariableManager get_vars() 30583 1726853754.53579: Calling all_inventory to load vars for managed_node2 30583 1726853754.53581: Calling groups_inventory to load vars for managed_node2 30583 1726853754.53583: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853754.53589: Calling all_plugins_play to load vars for managed_node2 30583 1726853754.53591: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853754.53594: Calling groups_plugins_play to load vars for managed_node2 30583 1726853754.54798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853754.56914: done with get_vars() 30583 1726853754.56941: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:35:54 -0400 (0:00:00.173) 0:01:29.909 ****** 30583 1726853754.57251: entering _queue_task() for managed_node2/include_tasks 30583 1726853754.58111: worker is 1 (out of 1 available) 30583 1726853754.58129: exiting _queue_task() for managed_node2/include_tasks 30583 1726853754.58299: done queuing things up, now waiting for results queue to drain 30583 1726853754.58300: waiting for pending results... 
30583 1726853754.58507: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30583 1726853754.58694: in run() - task 02083763-bbaf-05ea-abc5-000000001b3b 30583 1726853754.58720: variable 'ansible_search_path' from source: unknown 30583 1726853754.58728: variable 'ansible_search_path' from source: unknown 30583 1726853754.58782: calling self._execute() 30583 1726853754.58894: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853754.58905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853754.58926: variable 'omit' from source: magic vars 30583 1726853754.59331: variable 'ansible_distribution_major_version' from source: facts 30583 1726853754.59349: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853754.59366: _execute() done 30583 1726853754.59376: dumping result to json 30583 1726853754.59384: done dumping result, returning 30583 1726853754.59406: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-05ea-abc5-000000001b3b] 30583 1726853754.59416: sending task result for task 02083763-bbaf-05ea-abc5-000000001b3b 30583 1726853754.59635: done sending task result for task 02083763-bbaf-05ea-abc5-000000001b3b 30583 1726853754.59638: WORKER PROCESS EXITING 30583 1726853754.59705: no more pending results, returning what we have 30583 1726853754.59712: in VariableManager get_vars() 30583 1726853754.59767: Calling all_inventory to load vars for managed_node2 30583 1726853754.59773: Calling groups_inventory to load vars for managed_node2 30583 1726853754.59776: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853754.59907: Calling all_plugins_play to load vars for managed_node2 30583 1726853754.59913: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853754.59918: Calling 
groups_plugins_play to load vars for managed_node2 30583 1726853754.61563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853754.64387: done with get_vars() 30583 1726853754.64420: variable 'ansible_search_path' from source: unknown 30583 1726853754.64422: variable 'ansible_search_path' from source: unknown 30583 1726853754.64461: we have included files to process 30583 1726853754.64462: generating all_blocks data 30583 1726853754.64464: done generating all_blocks data 30583 1726853754.64469: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853754.64472: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853754.64478: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853754.65175: done processing included file 30583 1726853754.65177: iterating over new_blocks loaded from include file 30583 1726853754.65179: in VariableManager get_vars() 30583 1726853754.65209: done with get_vars() 30583 1726853754.65211: filtering new block on tags 30583 1726853754.65244: done filtering new block on tags 30583 1726853754.65248: in VariableManager get_vars() 30583 1726853754.65284: done with get_vars() 30583 1726853754.65287: filtering new block on tags 30583 1726853754.65331: done filtering new block on tags 30583 1726853754.65334: in VariableManager get_vars() 30583 1726853754.65357: done with get_vars() 30583 1726853754.65361: filtering new block on tags 30583 1726853754.65411: done filtering new block on tags 30583 1726853754.65414: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30583 1726853754.65419: extending task lists for 
all hosts with included blocks 30583 1726853754.68405: done extending task lists 30583 1726853754.68406: done processing included files 30583 1726853754.68407: results queue empty 30583 1726853754.68408: checking for any_errors_fatal 30583 1726853754.68412: done checking for any_errors_fatal 30583 1726853754.68413: checking for max_fail_percentage 30583 1726853754.68414: done checking for max_fail_percentage 30583 1726853754.68415: checking to see if all hosts have failed and the running result is not ok 30583 1726853754.68416: done checking to see if all hosts have failed 30583 1726853754.68417: getting the remaining hosts for this loop 30583 1726853754.68418: done getting the remaining hosts for this loop 30583 1726853754.68425: getting the next task for host managed_node2 30583 1726853754.68431: done getting next task for host managed_node2 30583 1726853754.68434: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30583 1726853754.68438: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853754.68452: getting variables 30583 1726853754.68453: in VariableManager get_vars() 30583 1726853754.68476: Calling all_inventory to load vars for managed_node2 30583 1726853754.68479: Calling groups_inventory to load vars for managed_node2 30583 1726853754.68481: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853754.68487: Calling all_plugins_play to load vars for managed_node2 30583 1726853754.68489: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853754.68492: Calling groups_plugins_play to load vars for managed_node2 30583 1726853754.69774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853754.71441: done with get_vars() 30583 1726853754.71475: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:35:54 -0400 (0:00:00.143) 0:01:30.052 ****** 30583 1726853754.71564: entering _queue_task() for managed_node2/setup 30583 1726853754.72175: worker is 1 (out of 1 available) 30583 1726853754.72185: exiting _queue_task() for managed_node2/setup 30583 1726853754.72196: done queuing things up, now waiting for results queue to drain 30583 1726853754.72197: waiting for pending results... 
30583 1726853754.72390: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30583 1726853754.72507: in run() - task 02083763-bbaf-05ea-abc5-000000001b92 30583 1726853754.72530: variable 'ansible_search_path' from source: unknown 30583 1726853754.72539: variable 'ansible_search_path' from source: unknown 30583 1726853754.72582: calling self._execute() 30583 1726853754.72702: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853754.72705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853754.72747: variable 'omit' from source: magic vars 30583 1726853754.73153: variable 'ansible_distribution_major_version' from source: facts 30583 1726853754.73175: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853754.73431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853754.75876: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853754.75924: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853754.75967: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853754.76026: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853754.76060: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853754.76161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853754.76222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853754.76238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853754.76339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853754.76343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853754.76384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853754.76415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853754.76456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853754.76504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853754.76552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853754.76712: variable '__network_required_facts' from source: role 
'' defaults 30583 1726853754.76727: variable 'ansible_facts' from source: unknown 30583 1726853754.77553: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30583 1726853754.77638: when evaluation is False, skipping this task 30583 1726853754.77642: _execute() done 30583 1726853754.77644: dumping result to json 30583 1726853754.77647: done dumping result, returning 30583 1726853754.77650: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-05ea-abc5-000000001b92] 30583 1726853754.77652: sending task result for task 02083763-bbaf-05ea-abc5-000000001b92 30583 1726853754.77727: done sending task result for task 02083763-bbaf-05ea-abc5-000000001b92 30583 1726853754.77731: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853754.77798: no more pending results, returning what we have 30583 1726853754.77802: results queue empty 30583 1726853754.77803: checking for any_errors_fatal 30583 1726853754.77804: done checking for any_errors_fatal 30583 1726853754.77805: checking for max_fail_percentage 30583 1726853754.77807: done checking for max_fail_percentage 30583 1726853754.77808: checking to see if all hosts have failed and the running result is not ok 30583 1726853754.77809: done checking to see if all hosts have failed 30583 1726853754.77810: getting the remaining hosts for this loop 30583 1726853754.77812: done getting the remaining hosts for this loop 30583 1726853754.77818: getting the next task for host managed_node2 30583 1726853754.77830: done getting next task for host managed_node2 30583 1726853754.77834: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30583 1726853754.77842: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853754.77988: getting variables 30583 1726853754.77995: in VariableManager get_vars() 30583 1726853754.78043: Calling all_inventory to load vars for managed_node2 30583 1726853754.78046: Calling groups_inventory to load vars for managed_node2 30583 1726853754.78049: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853754.78061: Calling all_plugins_play to load vars for managed_node2 30583 1726853754.78065: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853754.78182: Calling groups_plugins_play to load vars for managed_node2 30583 1726853754.80026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853754.81770: done with get_vars() 30583 1726853754.81800: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:35:54 -0400 (0:00:00.103) 0:01:30.156 ****** 30583 1726853754.81913: entering _queue_task() for managed_node2/stat 30583 1726853754.82485: worker is 1 (out of 1 available) 30583 1726853754.82496: exiting _queue_task() for managed_node2/stat 30583 1726853754.82507: done queuing things up, now waiting for results queue to drain 30583 1726853754.82508: waiting for pending results... 
30583 1726853754.82693: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30583 1726853754.82955: in run() - task 02083763-bbaf-05ea-abc5-000000001b94 30583 1726853754.82962: variable 'ansible_search_path' from source: unknown 30583 1726853754.82965: variable 'ansible_search_path' from source: unknown 30583 1726853754.82968: calling self._execute() 30583 1726853754.83036: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853754.83049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853754.83076: variable 'omit' from source: magic vars 30583 1726853754.83479: variable 'ansible_distribution_major_version' from source: facts 30583 1726853754.83503: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853754.83688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853754.83969: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853754.84017: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853754.84057: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853754.84096: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853754.84257: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853754.84262: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853754.84265: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853754.84267: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853754.84347: variable '__network_is_ostree' from source: set_fact 30583 1726853754.84366: Evaluated conditional (not __network_is_ostree is defined): False 30583 1726853754.84380: when evaluation is False, skipping this task 30583 1726853754.84388: _execute() done 30583 1726853754.84395: dumping result to json 30583 1726853754.84403: done dumping result, returning 30583 1726853754.84414: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-05ea-abc5-000000001b94] 30583 1726853754.84424: sending task result for task 02083763-bbaf-05ea-abc5-000000001b94 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30583 1726853754.84693: no more pending results, returning what we have 30583 1726853754.84697: results queue empty 30583 1726853754.84698: checking for any_errors_fatal 30583 1726853754.84708: done checking for any_errors_fatal 30583 1726853754.84709: checking for max_fail_percentage 30583 1726853754.84711: done checking for max_fail_percentage 30583 1726853754.84712: checking to see if all hosts have failed and the running result is not ok 30583 1726853754.84713: done checking to see if all hosts have failed 30583 1726853754.84714: getting the remaining hosts for this loop 30583 1726853754.84716: done getting the remaining hosts for this loop 30583 1726853754.84720: getting the next task for host managed_node2 30583 1726853754.84729: done getting next task for host managed_node2 30583 
1726853754.84733: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30583 1726853754.84741: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853754.84976: getting variables 30583 1726853754.84978: in VariableManager get_vars() 30583 1726853754.85016: Calling all_inventory to load vars for managed_node2 30583 1726853754.85019: Calling groups_inventory to load vars for managed_node2 30583 1726853754.85021: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853754.85030: Calling all_plugins_play to load vars for managed_node2 30583 1726853754.85033: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853754.85037: Calling groups_plugins_play to load vars for managed_node2 30583 1726853754.85879: done sending task result for task 02083763-bbaf-05ea-abc5-000000001b94 30583 1726853754.85883: WORKER PROCESS EXITING 30583 1726853754.87040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853754.89308: done with get_vars() 30583 1726853754.89337: done getting variables 30583 1726853754.89510: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:35:54 -0400 (0:00:00.076) 0:01:30.232 ****** 30583 1726853754.89551: entering _queue_task() for managed_node2/set_fact 30583 1726853754.90361: worker is 1 (out of 1 available) 30583 1726853754.90595: exiting _queue_task() for managed_node2/set_fact 30583 1726853754.90609: done queuing things up, now waiting for results queue to drain 30583 1726853754.90610: waiting for pending results... 
30583 1726853754.90737: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30583 1726853754.90931: in run() - task 02083763-bbaf-05ea-abc5-000000001b95 30583 1726853754.90954: variable 'ansible_search_path' from source: unknown 30583 1726853754.90975: variable 'ansible_search_path' from source: unknown 30583 1726853754.91020: calling self._execute() 30583 1726853754.91132: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853754.91145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853754.91167: variable 'omit' from source: magic vars 30583 1726853754.91600: variable 'ansible_distribution_major_version' from source: facts 30583 1726853754.91611: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853754.91768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853754.91977: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853754.92007: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853754.92031: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853754.92057: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853754.92123: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853754.92140: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853754.92162: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853754.92183: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853754.92478: variable '__network_is_ostree' from source: set_fact 30583 1726853754.92481: Evaluated conditional (not __network_is_ostree is defined): False 30583 1726853754.92483: when evaluation is False, skipping this task 30583 1726853754.92485: _execute() done 30583 1726853754.92487: dumping result to json 30583 1726853754.92489: done dumping result, returning 30583 1726853754.92491: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-05ea-abc5-000000001b95] 30583 1726853754.92492: sending task result for task 02083763-bbaf-05ea-abc5-000000001b95 30583 1726853754.92552: done sending task result for task 02083763-bbaf-05ea-abc5-000000001b95 30583 1726853754.92556: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30583 1726853754.92610: no more pending results, returning what we have 30583 1726853754.92614: results queue empty 30583 1726853754.92615: checking for any_errors_fatal 30583 1726853754.92622: done checking for any_errors_fatal 30583 1726853754.92623: checking for max_fail_percentage 30583 1726853754.92625: done checking for max_fail_percentage 30583 1726853754.92626: checking to see if all hosts have failed and the running result is not ok 30583 1726853754.92627: done checking to see if all hosts have failed 30583 1726853754.92627: getting the remaining hosts for this loop 30583 1726853754.92629: done getting the remaining hosts for this loop 
30583 1726853754.92633: getting the next task for host managed_node2 30583 1726853754.92645: done getting next task for host managed_node2 30583 1726853754.92649: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853754.92654: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853754.92890: getting variables 30583 1726853754.92892: in VariableManager get_vars() 30583 1726853754.92927: Calling all_inventory to load vars for managed_node2 30583 1726853754.92929: Calling groups_inventory to load vars for managed_node2 30583 1726853754.92931: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853754.92940: Calling all_plugins_play to load vars for managed_node2 30583 1726853754.92942: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853754.92945: Calling groups_plugins_play to load vars for managed_node2 30583 1726853754.96151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853755.00081: done with get_vars() 30583 1726853755.00109: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:35:55 -0400 (0:00:00.108) 0:01:30.341 ****** 30583 1726853755.00409: entering _queue_task() for managed_node2/service_facts 30583 1726853755.01246: worker is 1 (out of 1 available) 30583 1726853755.01261: exiting _queue_task() for managed_node2/service_facts 30583 1726853755.01278: done queuing things up, now waiting for results queue to drain 30583 1726853755.01279: waiting for pending results... 
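The `service_facts` execution that follows shows Ansible's remote bootstrap sequence: `echo ~` to find the remote home, then a `( umask 77 && mkdir -p … ) && sleep 0` command creating a temp directory named `ansible-tmp-<epoch>-<pid>-<random>` (here `ansible-tmp-1726853755.0816414-34813-33144784586427`), into which the AnsiballZ payload is copied before being run with the remote Python. A sketch of how such a name and wrapper command could be assembled (hypothetical helpers, not Ansible's actual implementation):

```python
import os
import random
import time

# Sketch of the remote temp-dir naming visible in the log, e.g.
# ansible-tmp-1726853755.0816414-34813-33144784586427
# (epoch timestamp, controller PID, random suffix). Hypothetical helper,
# not Ansible's real code.

def remote_tmpdir_name():
    return "ansible-tmp-%s-%s-%s" % (
        time.time(),               # 1726853755.0816414 in the log
        os.getpid(),               # 34813 in the log
        random.randint(0, 2**48),  # trailing random component
    )

def mkdir_command(basedir, tmpdir):
    # mirrors the "( umask 77 && mkdir -p ... ) && sleep 0" shell command
    # the log shows being sent over the multiplexed SSH connection
    return ('( umask 77 && mkdir -p "%s" && mkdir "%s/%s" ) && sleep 0'
            % (basedir, basedir, tmpdir))

name = remote_tmpdir_name()
cmd = mkdir_command("/root/.ansible/tmp", name)
print(cmd)
```

The `umask 77` keeps the directory owner-only, and the trailing `sleep 0` gives the shell a trivially successful final command so the exit status reflects the mkdirs.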
30583 1726853755.02117: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853755.02214: in run() - task 02083763-bbaf-05ea-abc5-000000001b97 30583 1726853755.02225: variable 'ansible_search_path' from source: unknown 30583 1726853755.02228: variable 'ansible_search_path' from source: unknown 30583 1726853755.02266: calling self._execute() 30583 1726853755.02381: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853755.02384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853755.02689: variable 'omit' from source: magic vars 30583 1726853755.03194: variable 'ansible_distribution_major_version' from source: facts 30583 1726853755.03423: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853755.03430: variable 'omit' from source: magic vars 30583 1726853755.03519: variable 'omit' from source: magic vars 30583 1726853755.03542: variable 'omit' from source: magic vars 30583 1726853755.03764: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853755.03799: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853755.03819: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853755.03850: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853755.03854: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853755.04004: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853755.04008: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853755.04013: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853755.04223: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853755.04288: Set connection var ansible_timeout to 10 30583 1726853755.04292: Set connection var ansible_connection to ssh 30583 1726853755.04294: Set connection var ansible_shell_executable to /bin/sh 30583 1726853755.04297: Set connection var ansible_shell_type to sh 30583 1726853755.04299: Set connection var ansible_pipelining to False 30583 1726853755.04300: variable 'ansible_shell_executable' from source: unknown 30583 1726853755.04302: variable 'ansible_connection' from source: unknown 30583 1726853755.04305: variable 'ansible_module_compression' from source: unknown 30583 1726853755.04307: variable 'ansible_shell_type' from source: unknown 30583 1726853755.04309: variable 'ansible_shell_executable' from source: unknown 30583 1726853755.04311: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853755.04312: variable 'ansible_pipelining' from source: unknown 30583 1726853755.04314: variable 'ansible_timeout' from source: unknown 30583 1726853755.04316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853755.04763: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853755.04768: variable 'omit' from source: magic vars 30583 1726853755.04772: starting attempt loop 30583 1726853755.04775: running the handler 30583 1726853755.04777: _low_level_execute_command(): starting 30583 1726853755.04779: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853755.06120: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853755.06125: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853755.06142: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853755.06147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853755.06250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853755.08005: stdout chunk (state=3): >>>/root <<< 30583 1726853755.08131: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853755.08137: stdout chunk (state=3): >>><<< 30583 1726853755.08145: stderr chunk (state=3): >>><<< 30583 1726853755.08164: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853755.08179: _low_level_execute_command(): starting 30583 1726853755.08185: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853755.0816414-34813-33144784586427 `" && echo ansible-tmp-1726853755.0816414-34813-33144784586427="` echo /root/.ansible/tmp/ansible-tmp-1726853755.0816414-34813-33144784586427 `" ) && sleep 0' 30583 1726853755.08824: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853755.08828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853755.08844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853755.08850: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853755.08867: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config <<< 30583 1726853755.08881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853755.08926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853755.08929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853755.09001: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853755.09009: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853755.09142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853755.11132: stdout chunk (state=3): >>>ansible-tmp-1726853755.0816414-34813-33144784586427=/root/.ansible/tmp/ansible-tmp-1726853755.0816414-34813-33144784586427 <<< 30583 1726853755.11380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853755.11384: stdout chunk (state=3): >>><<< 30583 1726853755.11386: stderr chunk (state=3): >>><<< 30583 1726853755.11389: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853755.0816414-34813-33144784586427=/root/.ansible/tmp/ansible-tmp-1726853755.0816414-34813-33144784586427 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853755.11391: variable 'ansible_module_compression' from source: unknown 30583 1726853755.11408: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30583 1726853755.11444: variable 'ansible_facts' from source: unknown 30583 1726853755.11547: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853755.0816414-34813-33144784586427/AnsiballZ_service_facts.py 30583 1726853755.11715: Sending initial data 30583 1726853755.11718: Sent initial data (161 bytes) 30583 1726853755.12368: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853755.12432: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853755.12442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853755.12544: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853755.14225: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30583 1726853755.14229: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853755.14288: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853755.14356: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpp7ijmory /root/.ansible/tmp/ansible-tmp-1726853755.0816414-34813-33144784586427/AnsiballZ_service_facts.py <<< 30583 1726853755.14375: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853755.0816414-34813-33144784586427/AnsiballZ_service_facts.py" <<< 30583 1726853755.14535: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpp7ijmory" to remote "/root/.ansible/tmp/ansible-tmp-1726853755.0816414-34813-33144784586427/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853755.0816414-34813-33144784586427/AnsiballZ_service_facts.py" <<< 30583 1726853755.15675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853755.15837: stderr chunk (state=3): >>><<< 30583 1726853755.15841: stdout chunk (state=3): >>><<< 30583 1726853755.15847: done transferring module to remote 30583 1726853755.15850: _low_level_execute_command(): starting 30583 1726853755.15852: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853755.0816414-34813-33144784586427/ /root/.ansible/tmp/ansible-tmp-1726853755.0816414-34813-33144784586427/AnsiballZ_service_facts.py && sleep 0' 30583 1726853755.16721: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853755.16725: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853755.16742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853755.16825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853755.16843: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853755.16921: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853755.19273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853755.19277: stdout chunk (state=3): >>><<< 30583 1726853755.19280: stderr chunk (state=3): >>><<< 30583 1726853755.19282: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853755.19284: _low_level_execute_command(): starting 30583 1726853755.19286: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853755.0816414-34813-33144784586427/AnsiballZ_service_facts.py && sleep 0' 30583 1726853755.20212: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853755.20216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853755.20228: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853755.20250: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853755.20254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853755.20318: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853755.20321: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853755.20397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853756.86053: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": 
{"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 30583 1726853756.86117: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": 
"systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30583 1726853756.87731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853756.87740: stderr chunk (state=3): >>>Shared connection to 10.31.9.197 closed. <<< 30583 1726853756.87789: stderr chunk (state=3): >>><<< 30583 1726853756.87797: stdout chunk (state=3): >>><<< 30583 1726853756.87819: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": 
"hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": 
"modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": 
"sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": 
"systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": 
{"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": 
{"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": 
"debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": 
"systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853756.88297: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853755.0816414-34813-33144784586427/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853756.88313: _low_level_execute_command(): starting 30583 1726853756.88317: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853755.0816414-34813-33144784586427/ > /dev/null 2>&1 && sleep 0' 30583 1726853756.88767: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853756.88770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853756.88775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853756.88778: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 <<< 30583 1726853756.88780: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853756.88819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853756.88822: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853756.88827: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853756.88898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853756.90824: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853756.90850: stderr chunk (state=3): >>><<< 30583 1726853756.90853: stdout chunk (state=3): >>><<< 30583 1726853756.90865: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853756.90872: handler run complete 30583 1726853756.90989: variable 'ansible_facts' from source: unknown 30583 1726853756.91083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853756.91364: variable 'ansible_facts' from source: unknown 30583 1726853756.91445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853756.91562: attempt loop complete, returning result 30583 1726853756.91566: _execute() done 30583 1726853756.91568: dumping result to json 30583 1726853756.91608: done dumping result, returning 30583 1726853756.91616: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-05ea-abc5-000000001b97] 30583 1726853756.91621: sending task result for task 02083763-bbaf-05ea-abc5-000000001b97 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853756.92242: no more pending results, returning what we have 30583 1726853756.92245: results queue empty 30583 1726853756.92245: checking for any_errors_fatal 30583 1726853756.92250: done checking for any_errors_fatal 30583 1726853756.92250: checking for max_fail_percentage 30583 1726853756.92252: done checking for max_fail_percentage 30583 1726853756.92253: checking to see if all hosts have failed and the running result is not ok 30583 1726853756.92253: done checking to see if all hosts have failed 30583 1726853756.92254: getting the remaining hosts for this loop 30583 1726853756.92255: done getting the remaining hosts for this loop 30583 1726853756.92261: getting the next task for host managed_node2 30583 1726853756.92267: done getting next task for host managed_node2 30583 1726853756.92270: ^ task is: 
TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 1726853756.92280: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853756.92289: done sending task result for task 02083763-bbaf-05ea-abc5-000000001b97 30583 1726853756.92292: WORKER PROCESS EXITING 30583 1726853756.92299: getting variables 30583 1726853756.92300: in VariableManager get_vars() 30583 1726853756.92325: Calling all_inventory to load vars for managed_node2 30583 1726853756.92327: Calling groups_inventory to load vars for managed_node2 30583 1726853756.92328: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853756.92334: Calling all_plugins_play to load vars for managed_node2 30583 1726853756.92336: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853756.92341: Calling groups_plugins_play to load vars for managed_node2 30583 1726853756.93140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853757.07949: done with get_vars() 30583 1726853757.07992: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:35:57 -0400 (0:00:02.077) 0:01:32.418 ****** 30583 1726853757.08162: entering _queue_task() for managed_node2/package_facts 30583 1726853757.08697: worker is 1 (out of 1 available) 30583 1726853757.08710: exiting _queue_task() for managed_node2/package_facts 30583 1726853757.08723: done queuing things up, now waiting for results queue to drain 30583 1726853757.08725: waiting for pending results... 
30583 1726853757.09093: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 1726853757.09237: in run() - task 02083763-bbaf-05ea-abc5-000000001b98 30583 1726853757.09249: variable 'ansible_search_path' from source: unknown 30583 1726853757.09253: variable 'ansible_search_path' from source: unknown 30583 1726853757.09382: calling self._execute() 30583 1726853757.09434: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853757.09443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853757.09453: variable 'omit' from source: magic vars 30583 1726853757.09906: variable 'ansible_distribution_major_version' from source: facts 30583 1726853757.09917: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853757.09923: variable 'omit' from source: magic vars 30583 1726853757.10009: variable 'omit' from source: magic vars 30583 1726853757.10106: variable 'omit' from source: magic vars 30583 1726853757.10145: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853757.10419: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853757.10423: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853757.10425: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853757.10428: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853757.10706: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853757.10710: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853757.10713: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853757.10716: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853757.10719: Set connection var ansible_timeout to 10 30583 1726853757.10721: Set connection var ansible_connection to ssh 30583 1726853757.10723: Set connection var ansible_shell_executable to /bin/sh 30583 1726853757.10725: Set connection var ansible_shell_type to sh 30583 1726853757.10731: Set connection var ansible_pipelining to False 30583 1726853757.10734: variable 'ansible_shell_executable' from source: unknown 30583 1726853757.10736: variable 'ansible_connection' from source: unknown 30583 1726853757.10738: variable 'ansible_module_compression' from source: unknown 30583 1726853757.10740: variable 'ansible_shell_type' from source: unknown 30583 1726853757.10742: variable 'ansible_shell_executable' from source: unknown 30583 1726853757.10744: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853757.10747: variable 'ansible_pipelining' from source: unknown 30583 1726853757.10749: variable 'ansible_timeout' from source: unknown 30583 1726853757.10751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853757.11145: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853757.11155: variable 'omit' from source: magic vars 30583 1726853757.11162: starting attempt loop 30583 1726853757.11170: running the handler 30583 1726853757.11185: _low_level_execute_command(): starting 30583 1726853757.11193: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853757.12551: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853757.12556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853757.12560: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853757.12625: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853757.14524: stdout chunk (state=3): >>>/root <<< 30583 1726853757.14528: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853757.14536: stdout chunk (state=3): >>><<< 30583 1726853757.14539: stderr chunk (state=3): >>><<< 30583 1726853757.14572: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853757.14587: _low_level_execute_command(): starting 30583 1726853757.14594: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853757.1457224-34942-216592107244266 `" && echo ansible-tmp-1726853757.1457224-34942-216592107244266="` echo /root/.ansible/tmp/ansible-tmp-1726853757.1457224-34942-216592107244266 `" ) && sleep 0' 30583 1726853757.15976: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853757.15980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853757.15983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853757.15986: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 
1726853757.15996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853757.16209: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853757.16318: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853757.18387: stdout chunk (state=3): >>>ansible-tmp-1726853757.1457224-34942-216592107244266=/root/.ansible/tmp/ansible-tmp-1726853757.1457224-34942-216592107244266 <<< 30583 1726853757.18636: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853757.18685: stderr chunk (state=3): >>><<< 30583 1726853757.18689: stdout chunk (state=3): >>><<< 30583 1726853757.18717: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853757.1457224-34942-216592107244266=/root/.ansible/tmp/ansible-tmp-1726853757.1457224-34942-216592107244266 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853757.18765: variable 'ansible_module_compression' from source: unknown 30583 1726853757.18819: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30583 1726853757.19087: variable 'ansible_facts' from source: unknown 30583 1726853757.19457: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853757.1457224-34942-216592107244266/AnsiballZ_package_facts.py 30583 1726853757.19988: Sending initial data 30583 1726853757.19992: Sent initial data (162 bytes) 30583 1726853757.21115: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853757.21224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 
1726853757.21323: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853757.21336: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853757.21389: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853757.21490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853757.23184: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853757.23250: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853757.23398: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpy4tm6y9r /root/.ansible/tmp/ansible-tmp-1726853757.1457224-34942-216592107244266/AnsiballZ_package_facts.py <<< 30583 1726853757.23402: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853757.1457224-34942-216592107244266/AnsiballZ_package_facts.py" <<< 30583 1726853757.23465: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpy4tm6y9r" to remote "/root/.ansible/tmp/ansible-tmp-1726853757.1457224-34942-216592107244266/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853757.1457224-34942-216592107244266/AnsiballZ_package_facts.py" <<< 30583 1726853757.26447: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853757.26452: stderr chunk (state=3): >>><<< 30583 1726853757.26454: stdout chunk (state=3): >>><<< 30583 1726853757.26482: done transferring module to remote 30583 1726853757.26496: _low_level_execute_command(): starting 30583 1726853757.26501: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853757.1457224-34942-216592107244266/ /root/.ansible/tmp/ansible-tmp-1726853757.1457224-34942-216592107244266/AnsiballZ_package_facts.py && sleep 0' 30583 1726853757.27617: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853757.27621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853757.27690: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853757.27694: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853757.27884: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853757.27901: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853757.27909: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853757.28007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853757.29913: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853757.29954: stderr chunk (state=3): >>><<< 30583 1726853757.29957: stdout chunk (state=3): >>><<< 30583 1726853757.29988: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853757.29991: _low_level_execute_command(): starting 30583 1726853757.29994: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853757.1457224-34942-216592107244266/AnsiballZ_package_facts.py && sleep 0' 30583 1726853757.31378: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853757.31382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853757.31403: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853757.31409: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30583 1726853757.31512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853757.76784: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": 
"2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 30583 1726853757.76853: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 30583 1726853757.76878: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", 
"version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": 
[{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": 
[{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": 
[{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": 
"grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": 
"1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": 
[{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": 
[{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": 
[{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": 
"perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": 
"xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": 
"keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", 
"source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30583 1726853757.78851: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
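The module result that ends here is the output shape of `ansible.builtin.package_facts`: `ansible_facts.packages` maps each package name to a *list* of installed instances, each with `name`, `version`, `release`, `epoch`, `arch`, and `source` keys (the list allows multiple installed versions, e.g. kernels). A minimal sketch of consuming that structure outside Ansible, using a two-package excerpt of the data logged above — the `pkg_version` helper is illustrative, not an Ansible API:

```python
# Minimal sketch: querying a package_facts-style dict, assuming the
# structure shown in the log above. `pkg_version` is a hypothetical
# helper, not part of Ansible.

# Excerpt of "ansible_facts.packages": each key maps to a list of
# installed instances of that package.
packages = {
    "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10",
                 "epoch": 1, "arch": "x86_64", "source": "rpm"}],
    "kernel": [{"name": "kernel", "version": "6.11.0",
                "release": "0.rc6.23.el10", "epoch": None,
                "arch": "x86_64", "source": "rpm"}],
}

def pkg_version(packages, name):
    """Return the epoch:version-release string of the first installed
    instance of `name`, or None if the package is not installed."""
    instances = packages.get(name)
    if not instances:
        return None
    p = instances[0]
    evr = f"{p['version']}-{p['release']}"
    if p.get("epoch") is not None:          # epoch is null for most packages
        evr = f"{p['epoch']}:{evr}"
    return evr

print(pkg_version(packages, "openssl"))  # 1:3.2.2-12.el10
print(pkg_version(packages, "kernel"))   # 6.11.0-0.rc6.23.el10
print(pkg_version(packages, "nginx"))    # None
```

Inside a playbook the same lookup is simply `ansible_facts.packages['openssl'][0].version` after a `package_facts` task has run.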
<<< 30583 1726853757.78865: stdout chunk (state=3): >>><<< 30583 1726853757.78882: stderr chunk (state=3): >>><<< 30583 1726853757.79278: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853757.82980: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853757.1457224-34942-216592107244266/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853757.83016: _low_level_execute_command(): starting 30583 1726853757.83027: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853757.1457224-34942-216592107244266/ > /dev/null 2>&1 && sleep 0' 30583 1726853757.83685: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853757.83700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853757.83716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853757.83769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853757.83837: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853757.83857: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853757.83891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853757.84003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853757.86113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853757.86139: stderr chunk (state=3): >>><<< 30583 1726853757.86142: stdout chunk (state=3): >>><<< 30583 1726853757.86160: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 30583 1726853757.86176: handler run complete 30583 1726853757.87719: variable 'ansible_facts' from source: unknown 30583 1726853757.88222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853757.90473: variable 'ansible_facts' from source: unknown 30583 1726853757.91399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853757.92254: attempt loop complete, returning result 30583 1726853757.92278: _execute() done 30583 1726853757.92286: dumping result to json 30583 1726853757.92608: done dumping result, returning 30583 1726853757.92629: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-05ea-abc5-000000001b98] 30583 1726853757.92641: sending task result for task 02083763-bbaf-05ea-abc5-000000001b98 30583 1726853757.96427: done sending task result for task 02083763-bbaf-05ea-abc5-000000001b98 30583 1726853757.96438: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853757.96605: no more pending results, returning what we have 30583 1726853757.96608: results queue empty 30583 1726853757.96609: checking for any_errors_fatal 30583 1726853757.96617: done checking for any_errors_fatal 30583 1726853757.96618: checking for max_fail_percentage 30583 1726853757.96620: done checking for max_fail_percentage 30583 1726853757.96621: checking to see if all hosts have failed and the running result is not ok 30583 1726853757.96621: done checking to see if all hosts have failed 30583 1726853757.96622: getting the remaining hosts for this loop 30583 1726853757.96624: done getting the remaining hosts for this loop 30583 1726853757.96627: getting the next task for host managed_node2 30583 1726853757.96635: done 
getting next task for host managed_node2 30583 1726853757.96639: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853757.96645: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853757.96665: getting variables 30583 1726853757.96667: in VariableManager get_vars() 30583 1726853757.96706: Calling all_inventory to load vars for managed_node2 30583 1726853757.96710: Calling groups_inventory to load vars for managed_node2 30583 1726853757.96712: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853757.96721: Calling all_plugins_play to load vars for managed_node2 30583 1726853757.96724: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853757.96727: Calling groups_plugins_play to load vars for managed_node2 30583 1726853757.99107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853758.03380: done with get_vars() 30583 1726853758.03419: done getting variables 30583 1726853758.03647: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:35:58 -0400 (0:00:00.955) 0:01:33.374 ****** 30583 1726853758.03759: entering _queue_task() for managed_node2/debug 30583 1726853758.04963: worker is 1 (out of 1 available) 30583 1726853758.04978: exiting _queue_task() for managed_node2/debug 30583 1726853758.04989: done queuing things up, now waiting for results queue to drain 30583 1726853758.04991: waiting for pending results... 
30583 1726853758.05483: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853758.05878: in run() - task 02083763-bbaf-05ea-abc5-000000001b3c 30583 1726853758.05881: variable 'ansible_search_path' from source: unknown 30583 1726853758.05884: variable 'ansible_search_path' from source: unknown 30583 1726853758.05886: calling self._execute() 30583 1726853758.06066: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853758.06070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853758.06083: variable 'omit' from source: magic vars 30583 1726853758.07316: variable 'ansible_distribution_major_version' from source: facts 30583 1726853758.07439: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853758.07452: variable 'omit' from source: magic vars 30583 1726853758.07520: variable 'omit' from source: magic vars 30583 1726853758.07951: variable 'network_provider' from source: set_fact 30583 1726853758.08092: variable 'omit' from source: magic vars 30583 1726853758.08139: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853758.08782: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853758.08786: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853758.08789: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853758.08792: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853758.08794: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853758.08797: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 
1726853758.08799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853758.08996: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853758.09027: Set connection var ansible_timeout to 10 30583 1726853758.09031: Set connection var ansible_connection to ssh 30583 1726853758.09034: Set connection var ansible_shell_executable to /bin/sh 30583 1726853758.09038: Set connection var ansible_shell_type to sh 30583 1726853758.09088: Set connection var ansible_pipelining to False 30583 1726853758.09091: variable 'ansible_shell_executable' from source: unknown 30583 1726853758.09094: variable 'ansible_connection' from source: unknown 30583 1726853758.09096: variable 'ansible_module_compression' from source: unknown 30583 1726853758.09306: variable 'ansible_shell_type' from source: unknown 30583 1726853758.09309: variable 'ansible_shell_executable' from source: unknown 30583 1726853758.09312: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853758.09314: variable 'ansible_pipelining' from source: unknown 30583 1726853758.09316: variable 'ansible_timeout' from source: unknown 30583 1726853758.09318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853758.09638: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853758.09642: variable 'omit' from source: magic vars 30583 1726853758.09742: starting attempt loop 30583 1726853758.09745: running the handler 30583 1726853758.09956: handler run complete 30583 1726853758.09960: attempt loop complete, returning result 30583 1726853758.09963: _execute() done 30583 1726853758.09966: dumping result to json 30583 1726853758.09968: done dumping result, returning 
30583 1726853758.09970: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-05ea-abc5-000000001b3c] 30583 1726853758.09974: sending task result for task 02083763-bbaf-05ea-abc5-000000001b3c 30583 1726853758.10042: done sending task result for task 02083763-bbaf-05ea-abc5-000000001b3c 30583 1726853758.10045: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 30583 1726853758.10136: no more pending results, returning what we have 30583 1726853758.10141: results queue empty 30583 1726853758.10142: checking for any_errors_fatal 30583 1726853758.10153: done checking for any_errors_fatal 30583 1726853758.10153: checking for max_fail_percentage 30583 1726853758.10155: done checking for max_fail_percentage 30583 1726853758.10156: checking to see if all hosts have failed and the running result is not ok 30583 1726853758.10157: done checking to see if all hosts have failed 30583 1726853758.10158: getting the remaining hosts for this loop 30583 1726853758.10160: done getting the remaining hosts for this loop 30583 1726853758.10164: getting the next task for host managed_node2 30583 1726853758.10179: done getting next task for host managed_node2 30583 1726853758.10183: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853758.10190: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853758.10202: getting variables 30583 1726853758.10204: in VariableManager get_vars() 30583 1726853758.10247: Calling all_inventory to load vars for managed_node2 30583 1726853758.10250: Calling groups_inventory to load vars for managed_node2 30583 1726853758.10252: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853758.10261: Calling all_plugins_play to load vars for managed_node2 30583 1726853758.10264: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853758.10266: Calling groups_plugins_play to load vars for managed_node2 30583 1726853758.14547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853758.19703: done with get_vars() 30583 1726853758.19737: done getting variables 30583 1726853758.19910: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:35:58 -0400 (0:00:00.161) 0:01:33.536 ****** 30583 1726853758.19957: entering _queue_task() for managed_node2/fail 30583 1726853758.21214: worker is 1 (out of 1 available) 30583 1726853758.21229: exiting _queue_task() for managed_node2/fail 30583 1726853758.21242: done queuing things up, now waiting for results queue to drain 30583 1726853758.21243: waiting for pending results... 30583 1726853758.21880: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853758.22137: in run() - task 02083763-bbaf-05ea-abc5-000000001b3d 30583 1726853758.22151: variable 'ansible_search_path' from source: unknown 30583 1726853758.22155: variable 'ansible_search_path' from source: unknown 30583 1726853758.22404: calling self._execute() 30583 1726853758.22494: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853758.22498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853758.22511: variable 'omit' from source: magic vars 30583 1726853758.23769: variable 'ansible_distribution_major_version' from source: facts 30583 1726853758.23936: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853758.24624: variable 'network_state' from source: role '' defaults 30583 1726853758.24628: Evaluated conditional (network_state != {}): False 30583 1726853758.24631: when evaluation is False, skipping this task 30583 1726853758.24634: _execute() done 30583 1726853758.24638: dumping result to json 30583 1726853758.24640: done dumping result, returning 30583 1726853758.24643: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-05ea-abc5-000000001b3d] 30583 1726853758.24646: sending task result for task 02083763-bbaf-05ea-abc5-000000001b3d 30583 1726853758.24719: done sending task result for task 02083763-bbaf-05ea-abc5-000000001b3d 30583 1726853758.24722: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853758.24776: no more pending results, returning what we have 30583 1726853758.24781: results queue empty 30583 1726853758.24782: checking for any_errors_fatal 30583 1726853758.24790: done checking for any_errors_fatal 30583 1726853758.24791: checking for max_fail_percentage 30583 1726853758.24794: done checking for max_fail_percentage 30583 1726853758.24794: checking to see if all hosts have failed and the running result is not ok 30583 1726853758.24795: done checking to see if all hosts have failed 30583 1726853758.24796: getting the remaining hosts for this loop 30583 1726853758.24798: done getting the remaining hosts for this loop 30583 1726853758.24802: getting the next task for host managed_node2 30583 1726853758.24811: done getting next task for host managed_node2 30583 1726853758.24815: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30583 1726853758.24821: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853758.24859: getting variables 30583 1726853758.24861: in VariableManager get_vars() 30583 1726853758.24909: Calling all_inventory to load vars for managed_node2 30583 1726853758.24912: Calling groups_inventory to load vars for managed_node2 30583 1726853758.24914: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853758.24925: Calling all_plugins_play to load vars for managed_node2 30583 1726853758.24927: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853758.24930: Calling groups_plugins_play to load vars for managed_node2 30583 1726853758.29769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853758.32978: done with get_vars() 30583 1726853758.33016: done getting variables 30583 1726853758.33085: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:35:58 -0400 (0:00:00.131) 0:01:33.668 ****** 30583 1726853758.33120: entering _queue_task() for managed_node2/fail 30583 1726853758.34068: worker is 1 (out of 1 available) 30583 1726853758.34091: exiting _queue_task() for managed_node2/fail 30583 1726853758.34104: done queuing things up, now waiting for results queue to drain 30583 1726853758.34105: waiting for pending results... 30583 1726853758.34794: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30583 1726853758.35014: in run() - task 02083763-bbaf-05ea-abc5-000000001b3e 30583 1726853758.35042: variable 'ansible_search_path' from source: unknown 30583 1726853758.35148: variable 'ansible_search_path' from source: unknown 30583 1726853758.35152: calling self._execute() 30583 1726853758.35223: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853758.35241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853758.35260: variable 'omit' from source: magic vars 30583 1726853758.35691: variable 'ansible_distribution_major_version' from source: facts 30583 1726853758.35707: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853758.35833: variable 'network_state' from source: role '' defaults 30583 1726853758.35849: Evaluated conditional (network_state != {}): False 30583 1726853758.35856: when evaluation is False, skipping this task 30583 1726853758.35863: _execute() done 30583 1726853758.35869: dumping result to json 30583 1726853758.35879: done dumping result, returning 30583 1726853758.35896: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [02083763-bbaf-05ea-abc5-000000001b3e] 30583 1726853758.35907: sending task result for task 02083763-bbaf-05ea-abc5-000000001b3e skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853758.36157: no more pending results, returning what we have 30583 1726853758.36161: results queue empty 30583 1726853758.36162: checking for any_errors_fatal 30583 1726853758.36174: done checking for any_errors_fatal 30583 1726853758.36175: checking for max_fail_percentage 30583 1726853758.36177: done checking for max_fail_percentage 30583 1726853758.36178: checking to see if all hosts have failed and the running result is not ok 30583 1726853758.36179: done checking to see if all hosts have failed 30583 1726853758.36180: getting the remaining hosts for this loop 30583 1726853758.36182: done getting the remaining hosts for this loop 30583 1726853758.36186: getting the next task for host managed_node2 30583 1726853758.36195: done getting next task for host managed_node2 30583 1726853758.36199: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30583 1726853758.36204: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853758.36252: getting variables 30583 1726853758.36255: in VariableManager get_vars() 30583 1726853758.36356: Calling all_inventory to load vars for managed_node2 30583 1726853758.36359: Calling groups_inventory to load vars for managed_node2 30583 1726853758.36361: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853758.36445: Calling all_plugins_play to load vars for managed_node2 30583 1726853758.36448: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853758.36451: Calling groups_plugins_play to load vars for managed_node2 30583 1726853758.37137: done sending task result for task 02083763-bbaf-05ea-abc5-000000001b3e 30583 1726853758.37140: WORKER PROCESS EXITING 30583 1726853758.39658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853758.42968: done with get_vars() 30583 1726853758.43119: done getting variables 30583 1726853758.43184: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 
or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:35:58 -0400 (0:00:00.101) 0:01:33.770 ****** 30583 1726853758.43304: entering _queue_task() for managed_node2/fail 30583 1726853758.44034: worker is 1 (out of 1 available) 30583 1726853758.44049: exiting _queue_task() for managed_node2/fail 30583 1726853758.44061: done queuing things up, now waiting for results queue to drain 30583 1726853758.44063: waiting for pending results... 30583 1726853758.44690: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30583 1726853758.45253: in run() - task 02083763-bbaf-05ea-abc5-000000001b3f 30583 1726853758.45256: variable 'ansible_search_path' from source: unknown 30583 1726853758.45259: variable 'ansible_search_path' from source: unknown 30583 1726853758.45261: calling self._execute() 30583 1726853758.45418: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853758.45431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853758.45447: variable 'omit' from source: magic vars 30583 1726853758.46362: variable 'ansible_distribution_major_version' from source: facts 30583 1726853758.46383: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853758.46672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853758.52277: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853758.52486: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853758.52722: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 
1726853758.52725: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853758.52728: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853758.52894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853758.52931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853758.53157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853758.53161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853758.53164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853758.53351: variable 'ansible_distribution_major_version' from source: facts 30583 1726853758.53578: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30583 1726853758.53634: variable 'ansible_distribution' from source: facts 30583 1726853758.53686: variable '__network_rh_distros' from source: role '' defaults 30583 1726853758.53702: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30583 1726853758.54309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853758.54556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853758.54559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853758.54562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853758.54564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853758.54709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853758.54737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853758.54769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853758.54995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853758.54998: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853758.55000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853758.55177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853758.55180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853758.55224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853758.55377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853758.55939: variable 'network_connections' from source: include params 30583 1726853758.56095: variable 'interface' from source: play vars 30583 1726853758.56167: variable 'interface' from source: play vars 30583 1726853758.56217: variable 'network_state' from source: role '' defaults 30583 1726853758.56292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853758.56719: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853758.56833: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 
1726853758.56885: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853758.56994: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853758.57048: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853758.57143: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853758.57243: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853758.57276: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853758.57366: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30583 1726853758.57394: when evaluation is False, skipping this task 30583 1726853758.57397: _execute() done 30583 1726853758.57400: dumping result to json 30583 1726853758.57403: done dumping result, returning 30583 1726853758.57502: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-05ea-abc5-000000001b3f] 30583 1726853758.57506: sending task result for task 02083763-bbaf-05ea-abc5-000000001b3f 30583 1726853758.57809: done sending task 
result for task 02083763-bbaf-05ea-abc5-000000001b3f 30583 1726853758.57813: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30583 1726853758.57865: no more pending results, returning what we have 30583 1726853758.57868: results queue empty 30583 1726853758.57869: checking for any_errors_fatal 30583 1726853758.57878: done checking for any_errors_fatal 30583 1726853758.57879: checking for max_fail_percentage 30583 1726853758.57882: done checking for max_fail_percentage 30583 1726853758.57883: checking to see if all hosts have failed and the running result is not ok 30583 1726853758.57883: done checking to see if all hosts have failed 30583 1726853758.57884: getting the remaining hosts for this loop 30583 1726853758.57886: done getting the remaining hosts for this loop 30583 1726853758.57890: getting the next task for host managed_node2 30583 1726853758.57898: done getting next task for host managed_node2 30583 1726853758.57903: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30583 1726853758.57908: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853758.57942: getting variables 30583 1726853758.57945: in VariableManager get_vars() 30583 1726853758.58278: Calling all_inventory to load vars for managed_node2 30583 1726853758.58282: Calling groups_inventory to load vars for managed_node2 30583 1726853758.58285: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853758.58294: Calling all_plugins_play to load vars for managed_node2 30583 1726853758.58298: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853758.58301: Calling groups_plugins_play to load vars for managed_node2 30583 1726853758.62219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853758.66491: done with get_vars() 30583 1726853758.66692: done getting variables 30583 1726853758.66769: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:35:58 -0400 (0:00:00.235) 0:01:34.005 ****** 30583 1726853758.66953: entering _queue_task() for managed_node2/dnf 30583 1726853758.68279: worker is 1 (out of 1 available) 30583 1726853758.68292: exiting _queue_task() for managed_node2/dnf 30583 1726853758.68302: done queuing things up, now waiting for results queue to drain 30583 1726853758.68304: waiting for pending results... 30583 1726853758.69253: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30583 1726853758.69420: in run() - task 02083763-bbaf-05ea-abc5-000000001b40 30583 1726853758.69780: variable 'ansible_search_path' from source: unknown 30583 1726853758.69784: variable 'ansible_search_path' from source: unknown 30583 1726853758.69787: calling self._execute() 30583 1726853758.70033: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853758.70046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853758.70062: variable 'omit' from source: magic vars 30583 1726853758.71293: variable 'ansible_distribution_major_version' from source: facts 30583 1726853758.71678: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853758.72173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853758.76638: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853758.76869: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853758.76875: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853758.77281: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853758.77285: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853758.77308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853758.77356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853758.77386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853758.77544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853758.77559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853758.77897: variable 'ansible_distribution' from source: facts 30583 1726853758.77900: variable 'ansible_distribution_major_version' from source: facts 30583 1726853758.77918: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30583 1726853758.78302: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853758.78579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853758.78599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853758.78743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853758.78847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853758.78863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853758.78903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853758.78924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853758.79073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853758.79108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853758.79121: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853758.79280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853758.79338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853758.79342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853758.79360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853758.79556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853758.79775: variable 'network_connections' from source: include params 30583 1726853758.79787: variable 'interface' from source: play vars 30583 1726853758.80057: variable 'interface' from source: play vars 30583 1726853758.80109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853758.80588: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853758.80715: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853758.80745: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853758.80779: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853758.80933: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853758.80954: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853758.81023: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853758.81050: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853758.81256: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853758.82323: variable 'network_connections' from source: include params 30583 1726853758.82327: variable 'interface' from source: play vars 30583 1726853758.82669: variable 'interface' from source: play vars 30583 1726853758.82724: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853758.82727: when evaluation is False, skipping this task 30583 1726853758.82730: _execute() done 30583 1726853758.82733: dumping result to json 30583 1726853758.82735: done dumping result, returning 30583 1726853758.82744: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000001b40] 30583 
1726853758.82754: sending task result for task 02083763-bbaf-05ea-abc5-000000001b40 30583 1726853758.83068: done sending task result for task 02083763-bbaf-05ea-abc5-000000001b40 30583 1726853758.83192: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853758.83334: no more pending results, returning what we have 30583 1726853758.83338: results queue empty 30583 1726853758.83339: checking for any_errors_fatal 30583 1726853758.83346: done checking for any_errors_fatal 30583 1726853758.83347: checking for max_fail_percentage 30583 1726853758.83349: done checking for max_fail_percentage 30583 1726853758.83350: checking to see if all hosts have failed and the running result is not ok 30583 1726853758.83351: done checking to see if all hosts have failed 30583 1726853758.83352: getting the remaining hosts for this loop 30583 1726853758.83354: done getting the remaining hosts for this loop 30583 1726853758.83360: getting the next task for host managed_node2 30583 1726853758.83369: done getting next task for host managed_node2 30583 1726853758.83376: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30583 1726853758.83381: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853758.83415: getting variables 30583 1726853758.83417: in VariableManager get_vars() 30583 1726853758.83465: Calling all_inventory to load vars for managed_node2 30583 1726853758.83467: Calling groups_inventory to load vars for managed_node2 30583 1726853758.83470: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853758.83786: Calling all_plugins_play to load vars for managed_node2 30583 1726853758.83789: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853758.83793: Calling groups_plugins_play to load vars for managed_node2 30583 1726853758.85978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853758.88834: done with get_vars() 30583 1726853758.88961: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30583 1726853758.89104: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:35:58 -0400 (0:00:00.221) 0:01:34.228 ****** 30583 1726853758.89143: entering _queue_task() for managed_node2/yum 30583 1726853758.89604: worker is 1 (out of 1 available) 30583 1726853758.89616: exiting _queue_task() for managed_node2/yum 30583 1726853758.89628: done queuing things up, now waiting for results queue to drain 30583 1726853758.89630: waiting for pending results... 30583 1726853758.89924: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30583 1726853758.90138: in run() - task 02083763-bbaf-05ea-abc5-000000001b41 30583 1726853758.90142: variable 'ansible_search_path' from source: unknown 30583 1726853758.90144: variable 'ansible_search_path' from source: unknown 30583 1726853758.90184: calling self._execute() 30583 1726853758.90331: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853758.90335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853758.90337: variable 'omit' from source: magic vars 30583 1726853758.90797: variable 'ansible_distribution_major_version' from source: facts 30583 1726853758.90814: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853758.91229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853758.94966: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853758.95047: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853758.95094: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853758.95137: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853758.95269: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853758.95281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853758.95324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853758.95356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853758.95415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853758.95434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853758.95551: variable 'ansible_distribution_major_version' from source: facts 30583 1726853758.95591: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30583 1726853758.95676: when evaluation is False, skipping this task 30583 1726853758.95680: _execute() done 30583 1726853758.95682: dumping result to json 30583 1726853758.95684: done dumping result, returning 30583 1726853758.95687: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000001b41] 30583 1726853758.95691: sending task result for task 02083763-bbaf-05ea-abc5-000000001b41 30583 1726853758.95779: done sending task result for task 02083763-bbaf-05ea-abc5-000000001b41 30583 1726853758.95782: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30583 1726853758.95843: no more pending results, returning what we have 30583 1726853758.95846: results queue empty 30583 1726853758.95848: checking for any_errors_fatal 30583 1726853758.95856: done checking for any_errors_fatal 30583 1726853758.95857: checking for max_fail_percentage 30583 1726853758.95861: done checking for max_fail_percentage 30583 1726853758.95863: checking to see if all hosts have failed and the running result is not ok 30583 1726853758.95863: done checking to see if all hosts have failed 30583 1726853758.95864: getting the remaining hosts for this loop 30583 1726853758.95866: done getting the remaining hosts for this loop 30583 1726853758.95870: getting the next task for host managed_node2 30583 1726853758.95932: done getting next task for host managed_node2 30583 1726853758.95938: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30583 1726853758.95943: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853758.95980: getting variables 30583 1726853758.95982: in VariableManager get_vars() 30583 1726853758.96207: Calling all_inventory to load vars for managed_node2 30583 1726853758.96211: Calling groups_inventory to load vars for managed_node2 30583 1726853758.96214: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853758.96225: Calling all_plugins_play to load vars for managed_node2 30583 1726853758.96228: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853758.96231: Calling groups_plugins_play to load vars for managed_node2 30583 1726853758.98050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853759.00092: done with get_vars() 30583 1726853759.00119: done getting variables 30583 1726853759.00179: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:35:59 -0400 (0:00:00.110) 0:01:34.339 ****** 30583 1726853759.00216: entering _queue_task() for managed_node2/fail 30583 1726853759.00535: worker is 1 (out of 1 available) 30583 1726853759.00549: exiting _queue_task() for managed_node2/fail 30583 1726853759.00561: done queuing things up, now waiting for results queue to drain 30583 1726853759.00563: waiting for pending results... 30583 1726853759.00785: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30583 1726853759.00889: in run() - task 02083763-bbaf-05ea-abc5-000000001b42 30583 1726853759.00901: variable 'ansible_search_path' from source: unknown 30583 1726853759.00904: variable 'ansible_search_path' from source: unknown 30583 1726853759.00956: calling self._execute() 30583 1726853759.01031: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853759.01036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853759.01045: variable 'omit' from source: magic vars 30583 1726853759.01343: variable 'ansible_distribution_major_version' from source: facts 30583 1726853759.01352: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853759.01501: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853759.01712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853759.04779: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853759.04782: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853759.04813: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853759.04852: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853759.04900: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853759.04992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853759.05510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853759.05542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853759.05597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853759.05619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853759.05669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853759.05710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853759.05778: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853759.05789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853759.05818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853759.05866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853759.05900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853759.05939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853759.06031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853759.06034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853759.06232: variable 'network_connections' from source: include params 30583 1726853759.06267: variable 'interface' from source: play vars 30583 1726853759.06380: variable 'interface' from source: play vars 30583 1726853759.06552: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853759.06924: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853759.07277: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853759.07283: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853759.07332: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853759.07577: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853759.07585: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853759.07678: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853759.07795: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853759.07868: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853759.08326: variable 'network_connections' from source: include params 30583 1726853759.08329: variable 'interface' from source: play vars 30583 1726853759.08390: variable 'interface' from source: play vars 30583 1726853759.08425: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853759.08429: when evaluation is False, skipping this task 30583 
1726853759.08431: _execute() done 30583 1726853759.08434: dumping result to json 30583 1726853759.08445: done dumping result, returning 30583 1726853759.08448: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000001b42] 30583 1726853759.08450: sending task result for task 02083763-bbaf-05ea-abc5-000000001b42 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853759.08681: no more pending results, returning what we have 30583 1726853759.08684: results queue empty 30583 1726853759.08685: checking for any_errors_fatal 30583 1726853759.08692: done checking for any_errors_fatal 30583 1726853759.08693: checking for max_fail_percentage 30583 1726853759.08696: done checking for max_fail_percentage 30583 1726853759.08697: checking to see if all hosts have failed and the running result is not ok 30583 1726853759.08697: done checking to see if all hosts have failed 30583 1726853759.08698: getting the remaining hosts for this loop 30583 1726853759.08700: done getting the remaining hosts for this loop 30583 1726853759.08704: getting the next task for host managed_node2 30583 1726853759.08712: done getting next task for host managed_node2 30583 1726853759.08715: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30583 1726853759.08719: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853759.08750: getting variables 30583 1726853759.08751: in VariableManager get_vars() 30583 1726853759.08798: Calling all_inventory to load vars for managed_node2 30583 1726853759.08801: Calling groups_inventory to load vars for managed_node2 30583 1726853759.08803: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853759.08813: Calling all_plugins_play to load vars for managed_node2 30583 1726853759.08816: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853759.08818: Calling groups_plugins_play to load vars for managed_node2 30583 1726853759.09386: done sending task result for task 02083763-bbaf-05ea-abc5-000000001b42 30583 1726853759.09393: WORKER PROCESS EXITING 30583 1726853759.10065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853759.11330: done with get_vars() 30583 1726853759.11361: done getting variables 30583 1726853759.11428: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:35:59 -0400 (0:00:00.112) 0:01:34.451 ****** 30583 1726853759.11480: entering _queue_task() for managed_node2/package 30583 1726853759.11844: worker is 1 (out of 1 available) 30583 1726853759.11856: exiting _queue_task() for managed_node2/package 30583 1726853759.11870: done queuing things up, now waiting for results queue to drain 30583 1726853759.12076: waiting for pending results... 30583 1726853759.12214: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 30583 1726853759.12292: in run() - task 02083763-bbaf-05ea-abc5-000000001b43 30583 1726853759.12311: variable 'ansible_search_path' from source: unknown 30583 1726853759.12315: variable 'ansible_search_path' from source: unknown 30583 1726853759.12376: calling self._execute() 30583 1726853759.12441: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853759.12445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853759.12455: variable 'omit' from source: magic vars 30583 1726853759.12831: variable 'ansible_distribution_major_version' from source: facts 30583 1726853759.12861: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853759.13020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853759.13229: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853759.13261: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853759.13294: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853759.13344: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853759.13431: variable 'network_packages' from source: role '' defaults 30583 1726853759.13510: variable '__network_provider_setup' from source: role '' defaults 30583 1726853759.13518: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853759.13565: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853759.13574: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853759.13619: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853759.13755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853759.15134: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853759.15183: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853759.15210: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853759.15234: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853759.15253: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853759.15316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853759.15335: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853759.15353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853759.15388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853759.15397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853759.15427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853759.15444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853759.15460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853759.15491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853759.15503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 
1726853759.15645: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853759.15763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853759.15799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853759.15833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853759.15892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853759.15911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853759.15976: variable 'ansible_python' from source: facts 30583 1726853759.16007: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853759.16097: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853759.16186: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853759.16356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853759.16390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853759.16418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853759.16440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853759.16470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853759.16516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853759.16537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853759.16579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853759.16609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853759.16622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853759.16791: variable 'network_connections' from source: include params 
30583 1726853759.16804: variable 'interface' from source: play vars 30583 1726853759.16894: variable 'interface' from source: play vars 30583 1726853759.16990: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853759.17033: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853759.17052: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853759.17094: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853759.17161: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853759.17359: variable 'network_connections' from source: include params 30583 1726853759.17366: variable 'interface' from source: play vars 30583 1726853759.17439: variable 'interface' from source: play vars 30583 1726853759.17461: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853759.17520: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853759.17811: variable 'network_connections' from source: include params 30583 1726853759.17814: variable 'interface' from source: play vars 30583 1726853759.17876: variable 'interface' from source: play vars 30583 1726853759.17920: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853759.17996: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853759.18316: variable 'network_connections' 
from source: include params 30583 1726853759.18319: variable 'interface' from source: play vars 30583 1726853759.18350: variable 'interface' from source: play vars 30583 1726853759.18404: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853759.18448: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853759.18453: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853759.18499: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853759.18644: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853759.19051: variable 'network_connections' from source: include params 30583 1726853759.19054: variable 'interface' from source: play vars 30583 1726853759.19104: variable 'interface' from source: play vars 30583 1726853759.19111: variable 'ansible_distribution' from source: facts 30583 1726853759.19113: variable '__network_rh_distros' from source: role '' defaults 30583 1726853759.19136: variable 'ansible_distribution_major_version' from source: facts 30583 1726853759.19139: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853759.19296: variable 'ansible_distribution' from source: facts 30583 1726853759.19299: variable '__network_rh_distros' from source: role '' defaults 30583 1726853759.19301: variable 'ansible_distribution_major_version' from source: facts 30583 1726853759.19315: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853759.19438: variable 'ansible_distribution' from source: facts 30583 1726853759.19441: variable '__network_rh_distros' from source: role '' defaults 30583 1726853759.19446: variable 'ansible_distribution_major_version' from source: facts 30583 1726853759.19488: variable 'network_provider' from source: set_fact 30583 
1726853759.19496: variable 'ansible_facts' from source: unknown 30583 1726853759.20018: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30583 1726853759.20021: when evaluation is False, skipping this task 30583 1726853759.20023: _execute() done 30583 1726853759.20026: dumping result to json 30583 1726853759.20028: done dumping result, returning 30583 1726853759.20036: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-05ea-abc5-000000001b43] 30583 1726853759.20040: sending task result for task 02083763-bbaf-05ea-abc5-000000001b43 30583 1726853759.20133: done sending task result for task 02083763-bbaf-05ea-abc5-000000001b43 30583 1726853759.20135: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30583 1726853759.20191: no more pending results, returning what we have 30583 1726853759.20195: results queue empty 30583 1726853759.20196: checking for any_errors_fatal 30583 1726853759.20203: done checking for any_errors_fatal 30583 1726853759.20204: checking for max_fail_percentage 30583 1726853759.20206: done checking for max_fail_percentage 30583 1726853759.20207: checking to see if all hosts have failed and the running result is not ok 30583 1726853759.20208: done checking to see if all hosts have failed 30583 1726853759.20208: getting the remaining hosts for this loop 30583 1726853759.20210: done getting the remaining hosts for this loop 30583 1726853759.20213: getting the next task for host managed_node2 30583 1726853759.20221: done getting next task for host managed_node2 30583 1726853759.20225: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30583 1726853759.20230: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853759.20267: getting variables 30583 1726853759.20269: in VariableManager get_vars() 30583 1726853759.20315: Calling all_inventory to load vars for managed_node2 30583 1726853759.20317: Calling groups_inventory to load vars for managed_node2 30583 1726853759.20319: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853759.20329: Calling all_plugins_play to load vars for managed_node2 30583 1726853759.20331: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853759.20334: Calling groups_plugins_play to load vars for managed_node2 30583 1726853759.21172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853759.22175: done with get_vars() 30583 1726853759.22191: done getting variables 30583 1726853759.22232: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:35:59 -0400 (0:00:00.107) 0:01:34.559 ****** 30583 1726853759.22258: entering _queue_task() for managed_node2/package 30583 1726853759.22517: worker is 1 (out of 1 available) 30583 1726853759.22531: exiting _queue_task() for managed_node2/package 30583 1726853759.22544: done queuing things up, now waiting for results queue to drain 30583 1726853759.22546: waiting for pending results... 
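The "Install packages" task above was skipped because its conditional, quoted in the result as `not network_packages is subset(ansible_facts.packages.keys())`, evaluated False: every package the role wants was already present in the `package_facts` data. Jinja2's `subset` test is plain set containment; a minimal Python sketch of the equivalent check (package names below are illustrative, not taken from this run):

```python
# Sketch of the Jinja2 "subset" test behind the skip:
#   not network_packages is subset(ansible_facts.packages.keys())
# The task runs only when some requested package is missing.

def is_subset(needed, installed):
    """True when every needed package already appears among the installed names."""
    return set(needed) <= set(installed)

# ansible_facts.packages (from package_facts) maps name -> list of version dicts;
# only the keys matter for this check. Hypothetical example data:
packages_fact = {"NetworkManager": [{"version": "1.46"}], "openssh": [{"version": "9.3"}]}
network_packages = ["NetworkManager"]

already_installed = is_subset(network_packages, packages_fact.keys())
print(already_installed)  # -> True, so "not ... subset" is False and the task is skipped
```

When the subset relation holds, `not ... is subset(...)` is False and Ansible reports `skip_reason: Conditional result was False`, exactly as in the JSON result above.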
30583 1726853759.22763: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30583 1726853759.22863: in run() - task 02083763-bbaf-05ea-abc5-000000001b44 30583 1726853759.22880: variable 'ansible_search_path' from source: unknown 30583 1726853759.22883: variable 'ansible_search_path' from source: unknown 30583 1726853759.22911: calling self._execute() 30583 1726853759.22988: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853759.22995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853759.23002: variable 'omit' from source: magic vars 30583 1726853759.23302: variable 'ansible_distribution_major_version' from source: facts 30583 1726853759.23311: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853759.23398: variable 'network_state' from source: role '' defaults 30583 1726853759.23406: Evaluated conditional (network_state != {}): False 30583 1726853759.23409: when evaluation is False, skipping this task 30583 1726853759.23412: _execute() done 30583 1726853759.23414: dumping result to json 30583 1726853759.23417: done dumping result, returning 30583 1726853759.23425: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-05ea-abc5-000000001b44] 30583 1726853759.23429: sending task result for task 02083763-bbaf-05ea-abc5-000000001b44 30583 1726853759.23519: done sending task result for task 02083763-bbaf-05ea-abc5-000000001b44 30583 1726853759.23521: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853759.23588: no more pending results, returning what we have 30583 1726853759.23592: results queue empty 30583 1726853759.23593: checking 
for any_errors_fatal 30583 1726853759.23598: done checking for any_errors_fatal 30583 1726853759.23599: checking for max_fail_percentage 30583 1726853759.23601: done checking for max_fail_percentage 30583 1726853759.23602: checking to see if all hosts have failed and the running result is not ok 30583 1726853759.23603: done checking to see if all hosts have failed 30583 1726853759.23604: getting the remaining hosts for this loop 30583 1726853759.23605: done getting the remaining hosts for this loop 30583 1726853759.23608: getting the next task for host managed_node2 30583 1726853759.23616: done getting next task for host managed_node2 30583 1726853759.23620: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30583 1726853759.23624: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853759.23649: getting variables 30583 1726853759.23650: in VariableManager get_vars() 30583 1726853759.23684: Calling all_inventory to load vars for managed_node2 30583 1726853759.23686: Calling groups_inventory to load vars for managed_node2 30583 1726853759.23688: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853759.23696: Calling all_plugins_play to load vars for managed_node2 30583 1726853759.23698: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853759.23700: Calling groups_plugins_play to load vars for managed_node2 30583 1726853759.24728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853759.25677: done with get_vars() 30583 1726853759.25692: done getting variables 30583 1726853759.25734: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:35:59 -0400 (0:00:00.035) 0:01:34.595 ****** 30583 1726853759.25785: entering _queue_task() for managed_node2/package 30583 1726853759.26070: worker is 1 (out of 1 available) 30583 1726853759.26087: exiting _queue_task() for managed_node2/package 30583 1726853759.26100: done queuing things up, now waiting for results queue to drain 30583 1726853759.26102: waiting for pending results... 
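Both nmstate-related install tasks in this stretch of the log are gated on `network_state != {}`. The log shows `network_state` coming from "role '' defaults", and the conditional evaluating False, i.e. this run drives the role through `network_connections` and leaves `network_state` at its empty-dict default. A small sketch of that gate (assuming an empty-dict default, as the evaluation above implies):

```python
# Sketch of the gate "network_state != {}" seen in the skip results.
# With network_state left at its (assumed) empty-dict role default,
# the NetworkManager/nmstate install tasks are skipped.

def should_install_nmstate(network_state):
    """Mirror the task's when-condition: run only for a non-empty state."""
    return network_state != {}

print(should_install_nmstate({}))                  # -> False: task skipped
print(should_install_nmstate({"interfaces": []}))  # -> True: task would run
```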
30583 1726853759.26325: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30583 1726853759.26448: in run() - task 02083763-bbaf-05ea-abc5-000000001b45 30583 1726853759.26454: variable 'ansible_search_path' from source: unknown 30583 1726853759.26480: variable 'ansible_search_path' from source: unknown 30583 1726853759.26504: calling self._execute() 30583 1726853759.26596: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853759.26600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853759.26608: variable 'omit' from source: magic vars 30583 1726853759.26980: variable 'ansible_distribution_major_version' from source: facts 30583 1726853759.26989: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853759.27092: variable 'network_state' from source: role '' defaults 30583 1726853759.27100: Evaluated conditional (network_state != {}): False 30583 1726853759.27103: when evaluation is False, skipping this task 30583 1726853759.27106: _execute() done 30583 1726853759.27108: dumping result to json 30583 1726853759.27111: done dumping result, returning 30583 1726853759.27121: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-05ea-abc5-000000001b45] 30583 1726853759.27124: sending task result for task 02083763-bbaf-05ea-abc5-000000001b45 30583 1726853759.27222: done sending task result for task 02083763-bbaf-05ea-abc5-000000001b45 30583 1726853759.27228: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853759.27317: no more pending results, returning what we have 30583 1726853759.27320: results queue empty 30583 1726853759.27321: checking for 
any_errors_fatal 30583 1726853759.27327: done checking for any_errors_fatal 30583 1726853759.27327: checking for max_fail_percentage 30583 1726853759.27329: done checking for max_fail_percentage 30583 1726853759.27330: checking to see if all hosts have failed and the running result is not ok 30583 1726853759.27331: done checking to see if all hosts have failed 30583 1726853759.27331: getting the remaining hosts for this loop 30583 1726853759.27333: done getting the remaining hosts for this loop 30583 1726853759.27340: getting the next task for host managed_node2 30583 1726853759.27348: done getting next task for host managed_node2 30583 1726853759.27352: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30583 1726853759.27356: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853759.27424: getting variables 30583 1726853759.27426: in VariableManager get_vars() 30583 1726853759.27459: Calling all_inventory to load vars for managed_node2 30583 1726853759.27462: Calling groups_inventory to load vars for managed_node2 30583 1726853759.27464: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853759.27482: Calling all_plugins_play to load vars for managed_node2 30583 1726853759.27486: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853759.27489: Calling groups_plugins_play to load vars for managed_node2 30583 1726853759.28402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853759.29657: done with get_vars() 30583 1726853759.29675: done getting variables 30583 1726853759.29749: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:35:59 -0400 (0:00:00.040) 0:01:34.635 ****** 30583 1726853759.29791: entering _queue_task() for managed_node2/service 30583 1726853759.30153: worker is 1 (out of 1 available) 30583 1726853759.30169: exiting _queue_task() for managed_node2/service 30583 1726853759.30183: done queuing things up, now waiting for results queue to drain 30583 1726853759.30185: waiting for pending results... 
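The next task, "Restart NetworkManager due to wireless or team interfaces", starts by resolving `__network_wireless_connections_defined` against `network_connections` (the repeated `variable 'network_connections'` / `variable 'interface'` lookups below). The role's actual Jinja2 expression is not shown in this log; a plausible shape of the check, offered only as an illustration, is a scan of connection types:

```python
# Illustrative sketch (NOT the role's actual expression): a restart of
# NetworkManager is warranted only if some configured connection is a
# wireless or team interface. Example data is hypothetical.

def needs_nm_restart(network_connections):
    """True when any connection is of type 'wireless' or 'team'."""
    return any(c.get("type") in ("wireless", "team") for c in network_connections)

connections = [{"name": "eth0", "type": "ethernet"}]
print(needs_nm_restart(connections))  # -> False: no restart needed for plain ethernet
```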
30583 1726853759.30469: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30583 1726853759.30546: in run() - task 02083763-bbaf-05ea-abc5-000000001b46 30583 1726853759.30558: variable 'ansible_search_path' from source: unknown 30583 1726853759.30561: variable 'ansible_search_path' from source: unknown 30583 1726853759.30595: calling self._execute() 30583 1726853759.30678: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853759.30681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853759.30691: variable 'omit' from source: magic vars 30583 1726853759.31019: variable 'ansible_distribution_major_version' from source: facts 30583 1726853759.31027: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853759.31117: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853759.31256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853759.32898: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853759.32960: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853759.33012: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853759.33044: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853759.33153: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853759.33196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30583 1726853759.33215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853759.33236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853759.33264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853759.33277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853759.33323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853759.33341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853759.33363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853759.33386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853759.33397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853759.33423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853759.33443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853759.33465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853759.33501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853759.33517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853759.33645: variable 'network_connections' from source: include params 30583 1726853759.33657: variable 'interface' from source: play vars 30583 1726853759.33727: variable 'interface' from source: play vars 30583 1726853759.33783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853759.33919: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853759.33952: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853759.33979: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853759.33999: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853759.34032: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853759.34047: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853759.34065: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853759.34095: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853759.34136: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853759.34330: variable 'network_connections' from source: include params 30583 1726853759.34335: variable 'interface' from source: play vars 30583 1726853759.34387: variable 'interface' from source: play vars 30583 1726853759.34407: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853759.34411: when evaluation is False, skipping this task 30583 1726853759.34413: _execute() done 30583 1726853759.34417: dumping result to json 30583 1726853759.34419: done dumping result, returning 30583 1726853759.34427: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000001b46] 30583 1726853759.34430: sending task result for task 02083763-bbaf-05ea-abc5-000000001b46 30583 1726853759.34537: done sending task result for task 
02083763-bbaf-05ea-abc5-000000001b46 30583 1726853759.34551: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853759.34610: no more pending results, returning what we have 30583 1726853759.34614: results queue empty 30583 1726853759.34615: checking for any_errors_fatal 30583 1726853759.34621: done checking for any_errors_fatal 30583 1726853759.34622: checking for max_fail_percentage 30583 1726853759.34624: done checking for max_fail_percentage 30583 1726853759.34625: checking to see if all hosts have failed and the running result is not ok 30583 1726853759.34625: done checking to see if all hosts have failed 30583 1726853759.34626: getting the remaining hosts for this loop 30583 1726853759.34628: done getting the remaining hosts for this loop 30583 1726853759.34632: getting the next task for host managed_node2 30583 1726853759.34644: done getting next task for host managed_node2 30583 1726853759.34648: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30583 1726853759.34653: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853759.34682: getting variables 30583 1726853759.34683: in VariableManager get_vars() 30583 1726853759.34726: Calling all_inventory to load vars for managed_node2 30583 1726853759.34732: Calling groups_inventory to load vars for managed_node2 30583 1726853759.34734: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853759.34743: Calling all_plugins_play to load vars for managed_node2 30583 1726853759.34745: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853759.34748: Calling groups_plugins_play to load vars for managed_node2 30583 1726853759.35720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853759.36742: done with get_vars() 30583 1726853759.36762: done getting variables 30583 1726853759.36826: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:35:59 -0400 (0:00:00.070) 0:01:34.705 ****** 30583 1726853759.36853: entering _queue_task() for managed_node2/service 30583 1726853759.37123: worker is 1 (out of 1 available) 30583 1726853759.37281: exiting _queue_task() for managed_node2/service 30583 1726853759.37294: done 
queuing things up, now waiting for results queue to drain 30583 1726853759.37295: waiting for pending results... 30583 1726853759.37485: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30583 1726853759.37698: in run() - task 02083763-bbaf-05ea-abc5-000000001b47 30583 1726853759.37702: variable 'ansible_search_path' from source: unknown 30583 1726853759.37704: variable 'ansible_search_path' from source: unknown 30583 1726853759.37710: calling self._execute() 30583 1726853759.37873: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853759.37879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853759.37889: variable 'omit' from source: magic vars 30583 1726853759.38395: variable 'ansible_distribution_major_version' from source: facts 30583 1726853759.38406: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853759.38587: variable 'network_provider' from source: set_fact 30583 1726853759.38598: variable 'network_state' from source: role '' defaults 30583 1726853759.38607: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30583 1726853759.38614: variable 'omit' from source: magic vars 30583 1726853759.38680: variable 'omit' from source: magic vars 30583 1726853759.38702: variable 'network_service_name' from source: role '' defaults 30583 1726853759.38749: variable 'network_service_name' from source: role '' defaults 30583 1726853759.38828: variable '__network_provider_setup' from source: role '' defaults 30583 1726853759.38832: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853759.38879: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853759.38886: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853759.38934: variable '__network_packages_default_nm' from source: role '' 
defaults 30583 1726853759.39084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853759.41116: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853759.41120: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853759.41123: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853759.41125: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853759.41127: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853759.41211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853759.41215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853759.41239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853759.41282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853759.41285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853759.41453: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853759.41456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853759.41459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853759.41461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853759.41464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853759.41646: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853759.41780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853759.41783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853759.41801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853759.41837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853759.41851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853759.41954: variable 'ansible_python' from source: facts 30583 1726853759.41957: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853759.42055: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853759.42133: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853759.42396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853759.42434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853759.42447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853759.42691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853759.42705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853759.42749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853759.42775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853759.42805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853759.42873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853759.42877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853759.43203: variable 'network_connections' from source: include params 30583 1726853759.43207: variable 'interface' from source: play vars 30583 1726853759.43320: variable 'interface' from source: play vars 30583 1726853759.43448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853759.43645: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853759.43905: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853759.43949: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853759.44301: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853759.44744: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853759.44787: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853759.44843: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853759.44891: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853759.44961: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853759.45377: variable 'network_connections' from source: include params 30583 1726853759.45381: variable 'interface' from source: play vars 30583 1726853759.45458: variable 'interface' from source: play vars 30583 1726853759.45600: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853759.45617: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853759.45913: variable 'network_connections' from source: include params 30583 1726853759.45944: variable 'interface' from source: play vars 30583 1726853759.46029: variable 'interface' from source: play vars 30583 1726853759.46056: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853759.46140: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853759.46426: variable 'network_connections' from source: include params 30583 1726853759.46435: variable 'interface' from source: play vars 30583 1726853759.46506: variable 'interface' from source: play vars 30583 1726853759.46560: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30583 1726853759.46624: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853759.46634: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853759.46699: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853759.47020: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853759.47578: variable 'network_connections' from source: include params 30583 1726853759.47581: variable 'interface' from source: play vars 30583 1726853759.47584: variable 'interface' from source: play vars 30583 1726853759.47586: variable 'ansible_distribution' from source: facts 30583 1726853759.47588: variable '__network_rh_distros' from source: role '' defaults 30583 1726853759.47590: variable 'ansible_distribution_major_version' from source: facts 30583 1726853759.47592: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853759.47726: variable 'ansible_distribution' from source: facts 30583 1726853759.47735: variable '__network_rh_distros' from source: role '' defaults 30583 1726853759.47745: variable 'ansible_distribution_major_version' from source: facts 30583 1726853759.47762: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853759.47933: variable 'ansible_distribution' from source: facts 30583 1726853759.47942: variable '__network_rh_distros' from source: role '' defaults 30583 1726853759.47950: variable 'ansible_distribution_major_version' from source: facts 30583 1726853759.47994: variable 'network_provider' from source: set_fact 30583 1726853759.48023: variable 'omit' from source: magic vars 30583 1726853759.48054: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853759.48092: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853759.48111: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853759.48131: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853759.48144: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853759.48179: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853759.48188: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853759.48195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853759.48300: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853759.48314: Set connection var ansible_timeout to 10 30583 1726853759.48321: Set connection var ansible_connection to ssh 30583 1726853759.48380: Set connection var ansible_shell_executable to /bin/sh 30583 1726853759.48383: Set connection var ansible_shell_type to sh 30583 1726853759.48385: Set connection var ansible_pipelining to False 30583 1726853759.48388: variable 'ansible_shell_executable' from source: unknown 30583 1726853759.48390: variable 'ansible_connection' from source: unknown 30583 1726853759.48393: variable 'ansible_module_compression' from source: unknown 30583 1726853759.48400: variable 'ansible_shell_type' from source: unknown 30583 1726853759.48405: variable 'ansible_shell_executable' from source: unknown 30583 1726853759.48411: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853759.48419: variable 'ansible_pipelining' from source: unknown 30583 1726853759.48427: variable 'ansible_timeout' from source: unknown 30583 1726853759.48434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 
1726853759.48536: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853759.48570: variable 'omit' from source: magic vars 30583 1726853759.48575: starting attempt loop 30583 1726853759.48578: running the handler 30583 1726853759.48625: variable 'ansible_facts' from source: unknown 30583 1726853759.49378: _low_level_execute_command(): starting 30583 1726853759.49381: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853759.49962: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853759.49968: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853759.50025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853759.50028: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853759.50033: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30583 1726853759.50107: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853759.52213: stdout chunk (state=3): >>>/root <<< 30583 1726853759.52217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853759.52219: stdout chunk (state=3): >>><<< 30583 1726853759.52221: stderr chunk (state=3): >>><<< 30583 1726853759.52224: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853759.52226: _low_level_execute_command(): starting 30583 1726853759.52230: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853759.5212328-35025-153292705389774 `" && echo 
ansible-tmp-1726853759.5212328-35025-153292705389774="` echo /root/.ansible/tmp/ansible-tmp-1726853759.5212328-35025-153292705389774 `" ) && sleep 0' 30583 1726853759.53338: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853759.53354: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853759.53377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853759.53395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853759.53415: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853759.53427: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853759.53444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853759.53465: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853759.53761: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853759.53856: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853759.53970: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853759.56017: stdout chunk (state=3): 
>>>ansible-tmp-1726853759.5212328-35025-153292705389774=/root/.ansible/tmp/ansible-tmp-1726853759.5212328-35025-153292705389774 <<< 30583 1726853759.56155: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853759.56169: stdout chunk (state=3): >>><<< 30583 1726853759.56184: stderr chunk (state=3): >>><<< 30583 1726853759.56205: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853759.5212328-35025-153292705389774=/root/.ansible/tmp/ansible-tmp-1726853759.5212328-35025-153292705389774 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853759.56245: variable 'ansible_module_compression' from source: unknown 30583 1726853759.56476: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30583 1726853759.56499: variable 'ansible_facts' 
from source: unknown 30583 1726853759.57193: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853759.5212328-35025-153292705389774/AnsiballZ_systemd.py 30583 1726853759.57392: Sending initial data 30583 1726853759.57402: Sent initial data (156 bytes) 30583 1726853759.58524: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853759.58881: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853759.58961: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853759.59183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853759.60910: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853759.60992: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853759.61187: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpwov945oz /root/.ansible/tmp/ansible-tmp-1726853759.5212328-35025-153292705389774/AnsiballZ_systemd.py <<< 30583 1726853759.61389: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853759.5212328-35025-153292705389774/AnsiballZ_systemd.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpwov945oz" to remote "/root/.ansible/tmp/ansible-tmp-1726853759.5212328-35025-153292705389774/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853759.5212328-35025-153292705389774/AnsiballZ_systemd.py" <<< 30583 1726853759.64843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853759.64849: stdout chunk (state=3): >>><<< 30583 1726853759.64852: stderr chunk (state=3): >>><<< 30583 1726853759.64854: done transferring module to remote 30583 1726853759.64856: _low_level_execute_command(): starting 30583 1726853759.64858: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853759.5212328-35025-153292705389774/ /root/.ansible/tmp/ansible-tmp-1726853759.5212328-35025-153292705389774/AnsiballZ_systemd.py && sleep 0' 30583 1726853759.66104: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config <<< 30583 1726853759.66120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853759.66131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853759.66305: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853759.66318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853759.66415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853759.68350: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853759.68382: stderr chunk (state=3): >>><<< 30583 1726853759.68389: stdout chunk (state=3): >>><<< 30583 1726853759.68425: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853759.68428: _low_level_execute_command(): starting 30583 1726853759.68436: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853759.5212328-35025-153292705389774/AnsiballZ_systemd.py && sleep 0' 30583 1726853759.69104: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853759.69184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853759.69277: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853759.69280: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853759.69283: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853759.69389: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853759.99442: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; 
argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4657152", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3305644032", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1950930000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": 
"0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredum<<< 30583 1726853759.99452: stdout chunk (state=3): >>>pReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", 
"StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "syst<<< 30583 1726853759.99455: stdout chunk (state=3): >>>em.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 
EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30583 1726853760.01566: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853760.01570: stderr chunk (state=3): >>>Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853760.01579: stdout chunk (state=3): >>><<< 30583 1726853760.01582: stderr chunk (state=3): >>><<< 30583 1726853760.01586: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4657152", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3305644032", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1950930000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", 
"ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853760.01857: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853759.5212328-35025-153292705389774/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853760.01883: _low_level_execute_command(): starting 30583 1726853760.01901: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853759.5212328-35025-153292705389774/ > /dev/null 2>&1 && sleep 0' 30583 1726853760.02744: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853760.02750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853760.02752: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853760.02754: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853760.02906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853760.04850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853760.04881: stderr chunk (state=3): >>><<< 30583 1726853760.04959: stdout chunk (state=3): >>><<< 30583 1726853760.04980: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853760.04986: handler run complete 30583 
1726853760.05191: attempt loop complete, returning result 30583 1726853760.05194: _execute() done 30583 1726853760.05197: dumping result to json 30583 1726853760.05219: done dumping result, returning 30583 1726853760.05225: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-05ea-abc5-000000001b47] 30583 1726853760.05230: sending task result for task 02083763-bbaf-05ea-abc5-000000001b47 30583 1726853760.05625: done sending task result for task 02083763-bbaf-05ea-abc5-000000001b47 30583 1726853760.05629: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853760.05704: no more pending results, returning what we have 30583 1726853760.05714: results queue empty 30583 1726853760.05715: checking for any_errors_fatal 30583 1726853760.05721: done checking for any_errors_fatal 30583 1726853760.05722: checking for max_fail_percentage 30583 1726853760.05724: done checking for max_fail_percentage 30583 1726853760.05725: checking to see if all hosts have failed and the running result is not ok 30583 1726853760.05726: done checking to see if all hosts have failed 30583 1726853760.05726: getting the remaining hosts for this loop 30583 1726853760.05728: done getting the remaining hosts for this loop 30583 1726853760.05732: getting the next task for host managed_node2 30583 1726853760.05740: done getting next task for host managed_node2 30583 1726853760.05744: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30583 1726853760.05749: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853760.05766: getting variables 30583 1726853760.05768: in VariableManager get_vars() 30583 1726853760.06133: Calling all_inventory to load vars for managed_node2 30583 1726853760.06137: Calling groups_inventory to load vars for managed_node2 30583 1726853760.06140: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853760.06150: Calling all_plugins_play to load vars for managed_node2 30583 1726853760.06153: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853760.06164: Calling groups_plugins_play to load vars for managed_node2 30583 1726853760.10792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853760.13018: done with get_vars() 30583 1726853760.13053: done getting variables 30583 1726853760.13428: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:36:00 -0400 (0:00:00.766) 0:01:35.471 ****** 30583 1726853760.13473: entering _queue_task() for managed_node2/service 30583 1726853760.14316: worker is 1 (out of 1 available) 30583 1726853760.14328: exiting _queue_task() for managed_node2/service 30583 1726853760.14340: done queuing things up, now waiting for results queue to drain 30583 1726853760.14341: waiting for pending results... 30583 1726853760.15203: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30583 1726853760.15622: in run() - task 02083763-bbaf-05ea-abc5-000000001b48 30583 1726853760.15626: variable 'ansible_search_path' from source: unknown 30583 1726853760.15629: variable 'ansible_search_path' from source: unknown 30583 1726853760.15631: calling self._execute() 30583 1726853760.15793: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853760.15886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853760.15902: variable 'omit' from source: magic vars 30583 1726853760.16329: variable 'ansible_distribution_major_version' from source: facts 30583 1726853760.16346: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853760.16505: variable 'network_provider' from source: set_fact 30583 1726853760.16532: Evaluated conditional (network_provider == "nm"): True 30583 1726853760.16644: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853760.16764: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 30583 1726853760.16957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853760.19427: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853760.19578: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853760.19615: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853760.19877: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853760.19882: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853760.19975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853760.20218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853760.20222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853760.20225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853760.20228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853760.20279: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853760.20309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853760.20368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853760.20413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853760.20438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853760.20563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853760.20593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853760.20622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853760.20879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 
1726853760.20882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853760.21068: variable 'network_connections' from source: include params 30583 1726853760.21193: variable 'interface' from source: play vars 30583 1726853760.21265: variable 'interface' from source: play vars 30583 1726853760.21362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853760.21825: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853760.21929: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853760.21974: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853760.22028: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853760.22087: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853760.22114: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853760.22144: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853760.22189: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853760.22243: variable 
'__network_wireless_connections_defined' from source: role '' defaults 30583 1726853760.22514: variable 'network_connections' from source: include params 30583 1726853760.22525: variable 'interface' from source: play vars 30583 1726853760.22596: variable 'interface' from source: play vars 30583 1726853760.22631: Evaluated conditional (__network_wpa_supplicant_required): False 30583 1726853760.22675: when evaluation is False, skipping this task 30583 1726853760.22678: _execute() done 30583 1726853760.22681: dumping result to json 30583 1726853760.22684: done dumping result, returning 30583 1726853760.22686: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-05ea-abc5-000000001b48] 30583 1726853760.22697: sending task result for task 02083763-bbaf-05ea-abc5-000000001b48 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30583 1726853760.22859: no more pending results, returning what we have 30583 1726853760.22863: results queue empty 30583 1726853760.22864: checking for any_errors_fatal 30583 1726853760.22881: done checking for any_errors_fatal 30583 1726853760.22882: checking for max_fail_percentage 30583 1726853760.22884: done checking for max_fail_percentage 30583 1726853760.22885: checking to see if all hosts have failed and the running result is not ok 30583 1726853760.22885: done checking to see if all hosts have failed 30583 1726853760.22886: getting the remaining hosts for this loop 30583 1726853760.22888: done getting the remaining hosts for this loop 30583 1726853760.22891: getting the next task for host managed_node2 30583 1726853760.22900: done getting next task for host managed_node2 30583 1726853760.22903: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30583 1726853760.22908: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853760.22992: getting variables 30583 1726853760.22994: in VariableManager get_vars() 30583 1726853760.23092: Calling all_inventory to load vars for managed_node2 30583 1726853760.23095: Calling groups_inventory to load vars for managed_node2 30583 1726853760.23097: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853760.23152: Calling all_plugins_play to load vars for managed_node2 30583 1726853760.23156: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853760.23158: Calling groups_plugins_play to load vars for managed_node2 30583 1726853760.23737: done sending task result for task 02083763-bbaf-05ea-abc5-000000001b48 30583 1726853760.23740: WORKER PROCESS EXITING 30583 1726853760.25834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853760.28267: done with get_vars() 30583 1726853760.28489: done getting variables 30583 1726853760.28742: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:36:00 -0400 (0:00:00.153) 0:01:35.625 ****** 30583 1726853760.28781: entering _queue_task() for managed_node2/service 30583 1726853760.29574: worker is 1 (out of 1 available) 30583 1726853760.29595: exiting _queue_task() for managed_node2/service 30583 1726853760.29609: done queuing things up, now waiting for results queue to drain 30583 1726853760.29610: waiting for pending results... 
30583 1726853760.30070: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30583 1726853760.30204: in run() - task 02083763-bbaf-05ea-abc5-000000001b49 30583 1726853760.30219: variable 'ansible_search_path' from source: unknown 30583 1726853760.30223: variable 'ansible_search_path' from source: unknown 30583 1726853760.30260: calling self._execute() 30583 1726853760.30360: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853760.30368: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853760.30464: variable 'omit' from source: magic vars 30583 1726853760.31149: variable 'ansible_distribution_major_version' from source: facts 30583 1726853760.31161: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853760.31483: variable 'network_provider' from source: set_fact 30583 1726853760.31489: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853760.31492: when evaluation is False, skipping this task 30583 1726853760.31494: _execute() done 30583 1726853760.31497: dumping result to json 30583 1726853760.31499: done dumping result, returning 30583 1726853760.31509: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-05ea-abc5-000000001b49] 30583 1726853760.31511: sending task result for task 02083763-bbaf-05ea-abc5-000000001b49 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853760.31823: no more pending results, returning what we have 30583 1726853760.31827: results queue empty 30583 1726853760.31834: checking for any_errors_fatal 30583 1726853760.31845: done checking for any_errors_fatal 30583 1726853760.31846: checking for max_fail_percentage 30583 1726853760.31935: done checking for max_fail_percentage 30583 
1726853760.31938: checking to see if all hosts have failed and the running result is not ok 30583 1726853760.31939: done checking to see if all hosts have failed 30583 1726853760.31939: getting the remaining hosts for this loop 30583 1726853760.31942: done getting the remaining hosts for this loop 30583 1726853760.31946: getting the next task for host managed_node2 30583 1726853760.31956: done getting next task for host managed_node2 30583 1726853760.32079: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853760.32085: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853760.32112: getting variables 30583 1726853760.32114: in VariableManager get_vars() 30583 1726853760.32156: Calling all_inventory to load vars for managed_node2 30583 1726853760.32160: Calling groups_inventory to load vars for managed_node2 30583 1726853760.32162: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853760.32276: Calling all_plugins_play to load vars for managed_node2 30583 1726853760.32284: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853760.32290: done sending task result for task 02083763-bbaf-05ea-abc5-000000001b49 30583 1726853760.32292: WORKER PROCESS EXITING 30583 1726853760.32296: Calling groups_plugins_play to load vars for managed_node2 30583 1726853760.34000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853760.36095: done with get_vars() 30583 1726853760.36121: done getting variables 30583 1726853760.36193: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:36:00 -0400 (0:00:00.074) 0:01:35.699 ****** 30583 1726853760.36229: entering _queue_task() for managed_node2/copy 30583 1726853760.36903: worker is 1 (out of 1 available) 30583 1726853760.37023: exiting _queue_task() for managed_node2/copy 30583 1726853760.37036: done queuing things up, now waiting for results queue to drain 30583 1726853760.37037: waiting for pending results... 
30583 1726853760.37315: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853760.37548: in run() - task 02083763-bbaf-05ea-abc5-000000001b4a 30583 1726853760.37566: variable 'ansible_search_path' from source: unknown 30583 1726853760.37644: variable 'ansible_search_path' from source: unknown 30583 1726853760.37731: calling self._execute() 30583 1726853760.37915: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853760.37924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853760.37934: variable 'omit' from source: magic vars 30583 1726853760.38816: variable 'ansible_distribution_major_version' from source: facts 30583 1726853760.38863: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853760.38998: variable 'network_provider' from source: set_fact 30583 1726853760.39015: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853760.39018: when evaluation is False, skipping this task 30583 1726853760.39025: _execute() done 30583 1726853760.39029: dumping result to json 30583 1726853760.39031: done dumping result, returning 30583 1726853760.39043: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-05ea-abc5-000000001b4a] 30583 1726853760.39046: sending task result for task 02083763-bbaf-05ea-abc5-000000001b4a skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30583 1726853760.39607: no more pending results, returning what we have 30583 1726853760.39611: results queue empty 30583 1726853760.39612: checking for any_errors_fatal 30583 1726853760.39619: done checking for any_errors_fatal 30583 1726853760.39620: checking for max_fail_percentage 30583 
1726853760.39622: done checking for max_fail_percentage 30583 1726853760.39623: checking to see if all hosts have failed and the running result is not ok 30583 1726853760.39623: done checking to see if all hosts have failed 30583 1726853760.39624: getting the remaining hosts for this loop 30583 1726853760.39625: done getting the remaining hosts for this loop 30583 1726853760.39629: getting the next task for host managed_node2 30583 1726853760.39636: done getting next task for host managed_node2 30583 1726853760.39646: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853760.39653: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853760.39678: getting variables 30583 1726853760.39680: in VariableManager get_vars() 30583 1726853760.39722: Calling all_inventory to load vars for managed_node2 30583 1726853760.39725: Calling groups_inventory to load vars for managed_node2 30583 1726853760.39727: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853760.39735: Calling all_plugins_play to load vars for managed_node2 30583 1726853760.39738: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853760.39740: Calling groups_plugins_play to load vars for managed_node2 30583 1726853760.39758: done sending task result for task 02083763-bbaf-05ea-abc5-000000001b4a 30583 1726853760.39761: WORKER PROCESS EXITING 30583 1726853760.41456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853760.43112: done with get_vars() 30583 1726853760.43141: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:36:00 -0400 (0:00:00.070) 0:01:35.769 ****** 30583 1726853760.43243: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853760.43610: worker is 1 (out of 1 available) 30583 1726853760.43631: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853760.43643: done queuing things up, now waiting for results queue to drain 30583 1726853760.43644: waiting for pending results... 
30583 1726853760.43964: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853760.44120: in run() - task 02083763-bbaf-05ea-abc5-000000001b4b 30583 1726853760.44140: variable 'ansible_search_path' from source: unknown 30583 1726853760.44147: variable 'ansible_search_path' from source: unknown 30583 1726853760.44202: calling self._execute() 30583 1726853760.44326: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853760.44338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853760.44354: variable 'omit' from source: magic vars 30583 1726853760.44830: variable 'ansible_distribution_major_version' from source: facts 30583 1726853760.44876: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853760.44879: variable 'omit' from source: magic vars 30583 1726853760.44919: variable 'omit' from source: magic vars 30583 1726853760.45156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853760.55116: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853760.55205: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853760.55376: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853760.55380: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853760.55383: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853760.55404: variable 'network_provider' from source: set_fact 30583 1726853760.55541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853760.55579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853760.55623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853760.55669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853760.55690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853760.55773: variable 'omit' from source: magic vars 30583 1726853760.55894: variable 'omit' from source: magic vars 30583 1726853760.56005: variable 'network_connections' from source: include params 30583 1726853760.56024: variable 'interface' from source: play vars 30583 1726853760.56155: variable 'interface' from source: play vars 30583 1726853760.56239: variable 'omit' from source: magic vars 30583 1726853760.56251: variable '__lsr_ansible_managed' from source: task vars 30583 1726853760.56331: variable '__lsr_ansible_managed' from source: task vars 30583 1726853760.56577: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30583 1726853760.56739: Loaded config def from plugin (lookup/template) 30583 1726853760.56748: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30583 1726853760.56780: File lookup term: get_ansible_managed.j2 30583 1726853760.56794: variable 
'ansible_search_path' from source: unknown 30583 1726853760.56814: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30583 1726853760.56830: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30583 1726853760.56850: variable 'ansible_search_path' from source: unknown 30583 1726853760.63849: variable 'ansible_managed' from source: unknown 30583 1726853760.64010: variable 'omit' from source: magic vars 30583 1726853760.64097: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853760.64105: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853760.64108: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853760.64116: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30583 1726853760.64129: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853760.64152: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853760.64160: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853760.64168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853760.64269: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853760.64289: Set connection var ansible_timeout to 10 30583 1726853760.64313: Set connection var ansible_connection to ssh 30583 1726853760.64316: Set connection var ansible_shell_executable to /bin/sh 30583 1726853760.64422: Set connection var ansible_shell_type to sh 30583 1726853760.64425: Set connection var ansible_pipelining to False 30583 1726853760.64427: variable 'ansible_shell_executable' from source: unknown 30583 1726853760.64429: variable 'ansible_connection' from source: unknown 30583 1726853760.64432: variable 'ansible_module_compression' from source: unknown 30583 1726853760.64434: variable 'ansible_shell_type' from source: unknown 30583 1726853760.64436: variable 'ansible_shell_executable' from source: unknown 30583 1726853760.64438: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853760.64439: variable 'ansible_pipelining' from source: unknown 30583 1726853760.64441: variable 'ansible_timeout' from source: unknown 30583 1726853760.64443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853760.64544: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853760.64567: variable 'omit' from 
source: magic vars 30583 1726853760.64580: starting attempt loop 30583 1726853760.64586: running the handler 30583 1726853760.64600: _low_level_execute_command(): starting 30583 1726853760.64609: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853760.65400: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853760.65455: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853760.65474: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853760.65490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853760.65614: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853760.67527: stdout chunk (state=3): >>>/root <<< 30583 1726853760.67548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853760.67608: stderr chunk (state=3): >>><<< 30583 1726853760.67737: stdout chunk (state=3): >>><<< 30583 1726853760.67741: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853760.67743: _low_level_execute_command(): starting 30583 1726853760.67746: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853760.67652-35085-163261107030095 `" && echo ansible-tmp-1726853760.67652-35085-163261107030095="` echo /root/.ansible/tmp/ansible-tmp-1726853760.67652-35085-163261107030095 `" ) && sleep 0' 30583 1726853760.68294: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853760.68308: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853760.68323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853760.68339: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853760.68356: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853760.68373: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853760.68470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853760.68495: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853760.68612: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853760.70688: stdout chunk (state=3): >>>ansible-tmp-1726853760.67652-35085-163261107030095=/root/.ansible/tmp/ansible-tmp-1726853760.67652-35085-163261107030095 <<< 30583 1726853760.70838: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853760.70842: stdout chunk (state=3): >>><<< 30583 1726853760.70844: stderr chunk (state=3): >>><<< 30583 1726853760.71098: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853760.67652-35085-163261107030095=/root/.ansible/tmp/ansible-tmp-1726853760.67652-35085-163261107030095 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853760.71102: variable 'ansible_module_compression' from source: unknown 30583 1726853760.71104: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30583 1726853760.71107: variable 'ansible_facts' from source: unknown 30583 1726853760.71201: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853760.67652-35085-163261107030095/AnsiballZ_network_connections.py 30583 1726853760.71362: Sending initial data 30583 1726853760.71481: Sent initial data (166 bytes) 30583 1726853760.72345: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853760.72394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853760.72429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853760.72564: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853760.72618: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853760.72677: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853760.72786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853760.74509: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853760.74621: stderr chunk (state=3): 
>>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853760.74719: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpc371062y /root/.ansible/tmp/ansible-tmp-1726853760.67652-35085-163261107030095/AnsiballZ_network_connections.py <<< 30583 1726853760.74722: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853760.67652-35085-163261107030095/AnsiballZ_network_connections.py" <<< 30583 1726853760.74834: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpc371062y" to remote "/root/.ansible/tmp/ansible-tmp-1726853760.67652-35085-163261107030095/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853760.67652-35085-163261107030095/AnsiballZ_network_connections.py" <<< 30583 1726853760.76276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853760.76449: stderr chunk (state=3): >>><<< 30583 1726853760.76453: stdout chunk (state=3): >>><<< 30583 1726853760.76455: done transferring module to remote 30583 1726853760.76457: _low_level_execute_command(): starting 30583 1726853760.76459: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853760.67652-35085-163261107030095/ /root/.ansible/tmp/ansible-tmp-1726853760.67652-35085-163261107030095/AnsiballZ_network_connections.py && sleep 0' 30583 1726853760.77369: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853760.77415: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853760.77432: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853760.77464: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853760.77587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853760.79862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853760.79866: stdout chunk (state=3): >>><<< 30583 1726853760.79868: stderr chunk (state=3): >>><<< 30583 1726853760.79873: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 
10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853760.79876: _low_level_execute_command(): starting 30583 1726853760.79878: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853760.67652-35085-163261107030095/AnsiballZ_network_connections.py && sleep 0' 30583 1726853760.80438: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853760.80452: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853760.80467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853760.80488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853760.80504: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853760.80515: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853760.80616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853760.80649: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853760.80769: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853761.09111: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_kkq6mukb/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_kkq6mukb/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/5c3c483d-e950-47f9-9afb-d5e74f691954: error=unknown <<< 30583 1726853761.09151: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30583 1726853761.11073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853761.11169: stderr chunk (state=3): >>>Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853761.11176: stdout chunk (state=3): >>><<< 30583 1726853761.11179: stderr chunk (state=3): >>><<< 30583 1726853761.11514: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_kkq6mukb/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_kkq6mukb/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/5c3c483d-e950-47f9-9afb-d5e74f691954: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853761.11518: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853760.67652-35085-163261107030095/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853761.11520: _low_level_execute_command(): starting 30583 1726853761.11522: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853760.67652-35085-163261107030095/ > /dev/null 2>&1 && sleep 0' 30583 1726853761.12527: stderr 
chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853761.12531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853761.12533: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853761.12535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853761.12662: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853761.12797: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853761.12948: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853761.14898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853761.14935: stderr chunk (state=3): >>><<< 30583 1726853761.14938: stdout chunk (state=3): >>><<< 30583 1726853761.14955: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853761.14963: handler run complete 30583 1726853761.14994: attempt loop complete, returning result 30583 1726853761.14997: _execute() done 30583 1726853761.14999: dumping result to json 30583 1726853761.15001: done dumping result, returning 30583 1726853761.15012: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-05ea-abc5-000000001b4b] 30583 1726853761.15020: sending task result for task 02083763-bbaf-05ea-abc5-000000001b4b 30583 1726853761.15203: done sending task result for task 02083763-bbaf-05ea-abc5-000000001b4b 30583 1726853761.15206: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 30583 1726853761.15308: no more pending results, returning what 
we have 30583 1726853761.15311: results queue empty 30583 1726853761.15312: checking for any_errors_fatal 30583 1726853761.15320: done checking for any_errors_fatal 30583 1726853761.15321: checking for max_fail_percentage 30583 1726853761.15323: done checking for max_fail_percentage 30583 1726853761.15324: checking to see if all hosts have failed and the running result is not ok 30583 1726853761.15324: done checking to see if all hosts have failed 30583 1726853761.15325: getting the remaining hosts for this loop 30583 1726853761.15327: done getting the remaining hosts for this loop 30583 1726853761.15331: getting the next task for host managed_node2 30583 1726853761.15338: done getting next task for host managed_node2 30583 1726853761.15341: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853761.15346: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853761.15362: getting variables 30583 1726853761.15363: in VariableManager get_vars() 30583 1726853761.15615: Calling all_inventory to load vars for managed_node2 30583 1726853761.15618: Calling groups_inventory to load vars for managed_node2 30583 1726853761.15620: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853761.15629: Calling all_plugins_play to load vars for managed_node2 30583 1726853761.15632: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853761.15634: Calling groups_plugins_play to load vars for managed_node2 30583 1726853761.27709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853761.28810: done with get_vars() 30583 1726853761.28832: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:36:01 -0400 (0:00:00.856) 0:01:36.626 ****** 30583 1726853761.28901: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30583 1726853761.29291: worker is 1 (out of 1 available) 30583 1726853761.29305: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30583 1726853761.29316: done queuing things up, now waiting for results queue to drain 30583 1726853761.29318: waiting for pending results... 
30583 1726853761.29792: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853761.29878: in run() - task 02083763-bbaf-05ea-abc5-000000001b4c 30583 1726853761.29909: variable 'ansible_search_path' from source: unknown 30583 1726853761.29918: variable 'ansible_search_path' from source: unknown 30583 1726853761.29969: calling self._execute() 30583 1726853761.30089: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853761.30113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853761.30223: variable 'omit' from source: magic vars 30583 1726853761.30573: variable 'ansible_distribution_major_version' from source: facts 30583 1726853761.30593: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853761.30737: variable 'network_state' from source: role '' defaults 30583 1726853761.30756: Evaluated conditional (network_state != {}): False 30583 1726853761.30776: when evaluation is False, skipping this task 30583 1726853761.30791: _execute() done 30583 1726853761.30804: dumping result to json 30583 1726853761.30815: done dumping result, returning 30583 1726853761.30830: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-05ea-abc5-000000001b4c] 30583 1726853761.30841: sending task result for task 02083763-bbaf-05ea-abc5-000000001b4c 30583 1726853761.31007: done sending task result for task 02083763-bbaf-05ea-abc5-000000001b4c 30583 1726853761.31011: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853761.31105: no more pending results, returning what we have 30583 1726853761.31112: results queue empty 30583 1726853761.31113: checking for any_errors_fatal 30583 1726853761.31130: done checking for any_errors_fatal 
30583 1726853761.31131: checking for max_fail_percentage
30583 1726853761.31133: done checking for max_fail_percentage
30583 1726853761.31134: checking to see if all hosts have failed and the running result is not ok
30583 1726853761.31135: done checking to see if all hosts have failed
30583 1726853761.31136: getting the remaining hosts for this loop
30583 1726853761.31139: done getting the remaining hosts for this loop
30583 1726853761.31142: getting the next task for host managed_node2
30583 1726853761.31150: done getting next task for host managed_node2
30583 1726853761.31155: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
30583 1726853761.31163: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853761.31199: getting variables
30583 1726853761.31201: in VariableManager get_vars()
30583 1726853761.31247: Calling all_inventory to load vars for managed_node2
30583 1726853761.31249: Calling groups_inventory to load vars for managed_node2
30583 1726853761.31252: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853761.31264: Calling all_plugins_play to load vars for managed_node2
30583 1726853761.31267: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853761.31270: Calling groups_plugins_play to load vars for managed_node2
30583 1726853761.32397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853761.33581: done with get_vars()
30583 1726853761.33617: done getting variables
30583 1726853761.33664: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Friday 20 September 2024 13:36:01 -0400 (0:00:00.047) 0:01:36.674 ******
30583 1726853761.33698: entering _queue_task() for managed_node2/debug
30583 1726853761.34002: worker is 1 (out of 1 available)
30583 1726853761.34016: exiting _queue_task() for managed_node2/debug
30583 1726853761.34031: done queuing things up, now waiting for results queue to drain
30583 1726853761.34032: waiting for pending results...
30583 1726853761.34248: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
30583 1726853761.34401: in run() - task 02083763-bbaf-05ea-abc5-000000001b4d
30583 1726853761.34413: variable 'ansible_search_path' from source: unknown
30583 1726853761.34416: variable 'ansible_search_path' from source: unknown
30583 1726853761.34446: calling self._execute()
30583 1726853761.34524: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853761.34528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853761.34536: variable 'omit' from source: magic vars
30583 1726853761.34867: variable 'ansible_distribution_major_version' from source: facts
30583 1726853761.34878: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853761.34896: variable 'omit' from source: magic vars
30583 1726853761.34990: variable 'omit' from source: magic vars
30583 1726853761.35041: variable 'omit' from source: magic vars
30583 1726853761.35079: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30583 1726853761.35106: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30583 1726853761.35123: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30583 1726853761.35136: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30583 1726853761.35148: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30583 1726853761.35177: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30583 1726853761.35181: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853761.35184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853761.35249: Set connection var ansible_module_compression to ZIP_DEFLATED
30583 1726853761.35254: Set connection var ansible_timeout to 10
30583 1726853761.35257: Set connection var ansible_connection to ssh
30583 1726853761.35262: Set connection var ansible_shell_executable to /bin/sh
30583 1726853761.35264: Set connection var ansible_shell_type to sh
30583 1726853761.35278: Set connection var ansible_pipelining to False
30583 1726853761.35296: variable 'ansible_shell_executable' from source: unknown
30583 1726853761.35299: variable 'ansible_connection' from source: unknown
30583 1726853761.35302: variable 'ansible_module_compression' from source: unknown
30583 1726853761.35304: variable 'ansible_shell_type' from source: unknown
30583 1726853761.35307: variable 'ansible_shell_executable' from source: unknown
30583 1726853761.35309: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853761.35311: variable 'ansible_pipelining' from source: unknown
30583 1726853761.35315: variable 'ansible_timeout' from source: unknown
30583 1726853761.35318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853761.35425: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30583 1726853761.35433: variable 'omit' from source: magic vars
30583 1726853761.35439: starting attempt loop
30583 1726853761.35442: running the handler
30583 1726853761.35548: variable '__network_connections_result' from source: set_fact
30583 1726853761.35591: handler run complete
30583 1726853761.35609: attempt loop complete, returning result
30583 1726853761.35612: _execute() done
30583 1726853761.35614: dumping result to json
30583 1726853761.35616: done dumping result, returning
30583 1726853761.35624: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-05ea-abc5-000000001b4d]
30583 1726853761.35626: sending task result for task 02083763-bbaf-05ea-abc5-000000001b4d
30583 1726853761.35719: done sending task result for task 02083763-bbaf-05ea-abc5-000000001b4d
30583 1726853761.35721: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "__network_connections_result.stderr_lines": [
        ""
    ]
}
30583 1726853761.35797: no more pending results, returning what we have
30583 1726853761.35801: results queue empty
30583 1726853761.35802: checking for any_errors_fatal
30583 1726853761.35808: done checking for any_errors_fatal
30583 1726853761.35809: checking for max_fail_percentage
30583 1726853761.35811: done checking for max_fail_percentage
30583 1726853761.35812: checking to see if all hosts have failed and the running result is not ok
30583 1726853761.35813: done checking to see if all hosts have failed
30583 1726853761.35813: getting the remaining hosts for this loop
30583 1726853761.35816: done getting the remaining hosts for this loop
30583 1726853761.35819: getting the next task for host managed_node2
30583 1726853761.35826: done getting next task for host managed_node2
30583 1726853761.35831: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
30583 1726853761.35836: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853761.35849: getting variables
30583 1726853761.35850: in VariableManager get_vars()
30583 1726853761.35895: Calling all_inventory to load vars for managed_node2
30583 1726853761.35898: Calling groups_inventory to load vars for managed_node2
30583 1726853761.35900: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853761.35909: Calling all_plugins_play to load vars for managed_node2
30583 1726853761.35911: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853761.35914: Calling groups_plugins_play to load vars for managed_node2
30583 1726853761.36755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853761.37678: done with get_vars()
30583 1726853761.37699: done getting variables
30583 1726853761.37747: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Friday 20 September 2024 13:36:01 -0400 (0:00:00.040) 0:01:36.715 ******
30583 1726853761.37783: entering _queue_task() for managed_node2/debug
30583 1726853761.38052: worker is 1 (out of 1 available)
30583 1726853761.38065: exiting _queue_task() for managed_node2/debug
30583 1726853761.38081: done queuing things up, now waiting for results queue to drain
30583 1726853761.38082: waiting for pending results...
30583 1726853761.38291: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
30583 1726853761.38400: in run() - task 02083763-bbaf-05ea-abc5-000000001b4e
30583 1726853761.38417: variable 'ansible_search_path' from source: unknown
30583 1726853761.38422: variable 'ansible_search_path' from source: unknown
30583 1726853761.38449: calling self._execute()
30583 1726853761.38524: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853761.38528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853761.38542: variable 'omit' from source: magic vars
30583 1726853761.38833: variable 'ansible_distribution_major_version' from source: facts
30583 1726853761.38846: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853761.38852: variable 'omit' from source: magic vars
30583 1726853761.38905: variable 'omit' from source: magic vars
30583 1726853761.38929: variable 'omit' from source: magic vars
30583 1726853761.38961: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30583 1726853761.38996: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30583 1726853761.39013: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30583 1726853761.39026: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30583 1726853761.39037: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30583 1726853761.39060: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30583 1726853761.39066: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853761.39069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853761.39175: Set connection var ansible_module_compression to ZIP_DEFLATED
30583 1726853761.39178: Set connection var ansible_timeout to 10
30583 1726853761.39212: Set connection var ansible_connection to ssh
30583 1726853761.39234: Set connection var ansible_shell_executable to /bin/sh
30583 1726853761.39237: Set connection var ansible_shell_type to sh
30583 1726853761.39250: Set connection var ansible_pipelining to False
30583 1726853761.39252: variable 'ansible_shell_executable' from source: unknown
30583 1726853761.39254: variable 'ansible_connection' from source: unknown
30583 1726853761.39257: variable 'ansible_module_compression' from source: unknown
30583 1726853761.39259: variable 'ansible_shell_type' from source: unknown
30583 1726853761.39281: variable 'ansible_shell_executable' from source: unknown
30583 1726853761.39285: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853761.39287: variable 'ansible_pipelining' from source: unknown
30583 1726853761.39302: variable 'ansible_timeout' from source: unknown
30583 1726853761.39305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853761.39398: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30583 1726853761.39407: variable 'omit' from source: magic vars
30583 1726853761.39412: starting attempt loop
30583 1726853761.39415: running the handler
30583 1726853761.39457: variable '__network_connections_result' from source: set_fact
30583 1726853761.39521: variable '__network_connections_result' from source: set_fact
30583 1726853761.39617: handler run complete
30583 1726853761.39654: attempt loop complete, returning result
30583 1726853761.39657: _execute() done
30583 1726853761.39660: dumping result to json
30583 1726853761.39662: done dumping result, returning
30583 1726853761.39665: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-05ea-abc5-000000001b4e]
30583 1726853761.39675: sending task result for task 02083763-bbaf-05ea-abc5-000000001b4e
30583 1726853761.39800: done sending task result for task 02083763-bbaf-05ea-abc5-000000001b4e
30583 1726853761.39803: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "statebr",
                        "persistent_state": "absent"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
30583 1726853761.39925: no more pending results, returning what we have
30583 1726853761.39928: results queue empty
30583 1726853761.39929: checking for any_errors_fatal
30583 1726853761.39936: done checking for any_errors_fatal
30583 1726853761.39936: checking for max_fail_percentage
30583 1726853761.39938: done checking for max_fail_percentage
30583 1726853761.39939: checking to see if all hosts have failed and the running result is not ok
30583 1726853761.39940: done checking to see if all hosts have failed
30583 1726853761.39940: getting the remaining hosts for this loop
30583 1726853761.39942: done getting the remaining hosts for this loop
30583 1726853761.39945: getting the next task for host managed_node2
30583 1726853761.39952: done getting next task for host managed_node2
30583 1726853761.39956: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
30583 1726853761.39963: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853761.39978: getting variables
30583 1726853761.39980: in VariableManager get_vars()
30583 1726853761.40015: Calling all_inventory to load vars for managed_node2
30583 1726853761.40018: Calling groups_inventory to load vars for managed_node2
30583 1726853761.40020: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853761.40032: Calling all_plugins_play to load vars for managed_node2
30583 1726853761.40035: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853761.40038: Calling groups_plugins_play to load vars for managed_node2
30583 1726853761.41256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853761.42307: done with get_vars()
30583 1726853761.42334: done getting variables
30583 1726853761.42397: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024 13:36:01 -0400 (0:00:00.046) 0:01:36.761 ******
30583 1726853761.42439: entering _queue_task() for managed_node2/debug
30583 1726853761.42753: worker is 1 (out of 1 available)
30583 1726853761.42767: exiting _queue_task() for managed_node2/debug
30583 1726853761.42782: done queuing things up, now waiting for results queue to drain
30583 1726853761.42783: waiting for pending results...
30583 1726853761.43056: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
30583 1726853761.43144: in run() - task 02083763-bbaf-05ea-abc5-000000001b4f
30583 1726853761.43151: variable 'ansible_search_path' from source: unknown
30583 1726853761.43155: variable 'ansible_search_path' from source: unknown
30583 1726853761.43213: calling self._execute()
30583 1726853761.43293: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853761.43297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853761.43306: variable 'omit' from source: magic vars
30583 1726853761.43678: variable 'ansible_distribution_major_version' from source: facts
30583 1726853761.43690: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853761.43778: variable 'network_state' from source: role '' defaults
30583 1726853761.43785: Evaluated conditional (network_state != {}): False
30583 1726853761.43787: when evaluation is False, skipping this task
30583 1726853761.43791: _execute() done
30583 1726853761.43794: dumping result to json
30583 1726853761.43796: done dumping result, returning
30583 1726853761.43828: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-05ea-abc5-000000001b4f]
30583 1726853761.43831: sending task result for task 02083763-bbaf-05ea-abc5-000000001b4f
30583 1726853761.43909: done sending task result for task 02083763-bbaf-05ea-abc5-000000001b4f
30583 1726853761.43911: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "false_condition": "network_state != {}"
}
30583 1726853761.43962: no more pending results, returning what we have
30583 1726853761.43966: results queue empty
30583 1726853761.43967: checking for any_errors_fatal
30583 1726853761.43979: done checking for any_errors_fatal
30583 1726853761.43980: checking for max_fail_percentage
30583 1726853761.43982: done checking for max_fail_percentage
30583 1726853761.43983: checking to see if all hosts have failed and the running result is not ok
30583 1726853761.43983: done checking to see if all hosts have failed
30583 1726853761.43984: getting the remaining hosts for this loop
30583 1726853761.43986: done getting the remaining hosts for this loop
30583 1726853761.43990: getting the next task for host managed_node2
30583 1726853761.43997: done getting next task for host managed_node2
30583 1726853761.44000: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
30583 1726853761.44006: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853761.44037: getting variables
30583 1726853761.44038: in VariableManager get_vars()
30583 1726853761.44088: Calling all_inventory to load vars for managed_node2
30583 1726853761.44091: Calling groups_inventory to load vars for managed_node2
30583 1726853761.44093: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853761.44104: Calling all_plugins_play to load vars for managed_node2
30583 1726853761.44107: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853761.44110: Calling groups_plugins_play to load vars for managed_node2
30583 1726853761.44919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853761.45864: done with get_vars()
30583 1726853761.45888: done getting variables
TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024 13:36:01 -0400 (0:00:00.035) 0:01:36.796 ******
30583 1726853761.45970: entering _queue_task() for managed_node2/ping
30583 1726853761.46199: worker is 1 (out of 1 available)
30583 1726853761.46214: exiting _queue_task() for managed_node2/ping
30583 1726853761.46226: done queuing things up, now waiting for results queue to drain
30583 1726853761.46228: waiting for pending results...
30583 1726853761.46462: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity
30583 1726853761.46552: in run() - task 02083763-bbaf-05ea-abc5-000000001b50
30583 1726853761.46566: variable 'ansible_search_path' from source: unknown
30583 1726853761.46570: variable 'ansible_search_path' from source: unknown
30583 1726853761.46602: calling self._execute()
30583 1726853761.46693: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853761.46698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853761.46701: variable 'omit' from source: magic vars
30583 1726853761.47013: variable 'ansible_distribution_major_version' from source: facts
30583 1726853761.47023: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853761.47029: variable 'omit' from source: magic vars
30583 1726853761.47079: variable 'omit' from source: magic vars
30583 1726853761.47102: variable 'omit' from source: magic vars
30583 1726853761.47136: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30583 1726853761.47167: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30583 1726853761.47185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30583 1726853761.47199: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30583 1726853761.47208: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30583 1726853761.47231: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30583 1726853761.47234: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853761.47236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853761.47312: Set connection var ansible_module_compression to ZIP_DEFLATED
30583 1726853761.47315: Set connection var ansible_timeout to 10
30583 1726853761.47318: Set connection var ansible_connection to ssh
30583 1726853761.47324: Set connection var ansible_shell_executable to /bin/sh
30583 1726853761.47326: Set connection var ansible_shell_type to sh
30583 1726853761.47334: Set connection var ansible_pipelining to False
30583 1726853761.47351: variable 'ansible_shell_executable' from source: unknown
30583 1726853761.47354: variable 'ansible_connection' from source: unknown
30583 1726853761.47356: variable 'ansible_module_compression' from source: unknown
30583 1726853761.47359: variable 'ansible_shell_type' from source: unknown
30583 1726853761.47364: variable 'ansible_shell_executable' from source: unknown
30583 1726853761.47367: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853761.47369: variable 'ansible_pipelining' from source: unknown
30583 1726853761.47373: variable 'ansible_timeout' from source: unknown
30583 1726853761.47385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853761.47535: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
30583 1726853761.47543: variable 'omit' from source: magic vars
30583 1726853761.47549: starting attempt loop
30583 1726853761.47551: running the handler
30583 1726853761.47566: _low_level_execute_command(): starting
30583 1726853761.47574: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
30583 1726853761.48070: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
<<<
30583 1726853761.48098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
<<<
30583 1726853761.48102: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
<<<
30583 1726853761.48156: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d'
<<<
30583 1726853761.48164: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK
<<<
30583 1726853761.48166: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4
<<<
30583 1726853761.48240: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2
<<<
30583 1726853761.49995: stdout chunk (state=3): >>>/root
<<<
30583 1726853761.50102: stderr chunk (state=3): >>>debug2: Received exit status from master 0
<<<
30583 1726853761.50139: stderr chunk (state=3): >>><<<
30583 1726853761.50142: stdout chunk (state=3): >>><<<
30583 1726853761.50167: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.197 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
30583 1726853761.50179: _low_level_execute_command(): starting
30583 1726853761.50184: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853761.5016565-35126-34352525733152 `" && echo ansible-tmp-1726853761.5016565-35126-34352525733152="` echo /root/.ansible/tmp/ansible-tmp-1726853761.5016565-35126-34352525733152 `" ) && sleep 0'
30583 1726853761.50707: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
<<<
30583 1726853761.50711: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.197 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
<<<
30583 1726853761.50721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
<<<
30583 1726853761.50761: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d'
<<<
30583 1726853761.50765: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
<<<
30583 1726853761.50861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2
<<<
30583 1726853761.52883: stdout chunk (state=3): >>>ansible-tmp-1726853761.5016565-35126-34352525733152=/root/.ansible/tmp/ansible-tmp-1726853761.5016565-35126-34352525733152
<<<
30583 1726853761.53014: stderr chunk (state=3): >>>debug2: Received exit status from master 0
<<<
30583 1726853761.53038: stderr chunk (state=3): >>><<<
30583 1726853761.53041: stdout chunk (state=3): >>><<<
30583 1726853761.53062: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853761.5016565-35126-34352525733152=/root/.ansible/tmp/ansible-tmp-1726853761.5016565-35126-34352525733152
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853761.53108: variable 'ansible_module_compression' from source: unknown 30583 1726853761.53141: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30583 1726853761.53175: variable 'ansible_facts' from source: unknown 30583 1726853761.53230: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853761.5016565-35126-34352525733152/AnsiballZ_ping.py 30583 1726853761.53340: Sending initial data 30583 1726853761.53343: Sent initial data (152 bytes) 30583 1726853761.53962: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853761.53965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853761.53967: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853761.53975: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853761.53977: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853761.54032: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853761.54035: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853761.54038: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853761.54129: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853761.55830: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853761.55948: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853761.55992: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpq3hs1w8l /root/.ansible/tmp/ansible-tmp-1726853761.5016565-35126-34352525733152/AnsiballZ_ping.py <<< 30583 1726853761.55995: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853761.5016565-35126-34352525733152/AnsiballZ_ping.py" <<< 30583 1726853761.56212: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpq3hs1w8l" to remote "/root/.ansible/tmp/ansible-tmp-1726853761.5016565-35126-34352525733152/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853761.5016565-35126-34352525733152/AnsiballZ_ping.py" <<< 30583 1726853761.57283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853761.57329: stderr chunk (state=3): >>><<< 30583 1726853761.57337: stdout chunk (state=3): >>><<< 30583 1726853761.57365: done transferring module to remote 30583 1726853761.57376: _low_level_execute_command(): starting 30583 1726853761.57381: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853761.5016565-35126-34352525733152/ /root/.ansible/tmp/ansible-tmp-1726853761.5016565-35126-34352525733152/AnsiballZ_ping.py && sleep 0' 30583 1726853761.57798: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853761.57802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853761.57804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853761.57850: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853761.57862: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853761.57928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853761.60062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853761.60088: stderr chunk (state=3): >>><<< 30583 1726853761.60091: stdout chunk (state=3): >>><<< 30583 1726853761.60107: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853761.60185: _low_level_execute_command(): starting 30583 1726853761.60189: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853761.5016565-35126-34352525733152/AnsiballZ_ping.py && sleep 0' 30583 1726853761.60686: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853761.60699: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853761.60711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853761.60726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853761.60743: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853761.60755: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853761.60770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853761.60792: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853761.60885: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853761.60898: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853761.60914: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853761.61025: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853761.76791: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30583 1726853761.78290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853761.78294: stdout chunk (state=3): >>><<< 30583 1726853761.78300: stderr chunk (state=3): >>><<< 30583 1726853761.78319: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853761.78443: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853761.5016565-35126-34352525733152/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853761.78447: _low_level_execute_command(): starting 30583 1726853761.78450: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853761.5016565-35126-34352525733152/ > /dev/null 2>&1 && sleep 0' 30583 1726853761.79051: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853761.79070: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853761.79098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853761.79116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853761.79219: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853761.79236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853761.79253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853761.79281: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853761.79402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853761.81367: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853761.81371: stdout chunk (state=3): >>><<< 30583 1726853761.81380: stderr chunk (state=3): >>><<< 30583 1726853761.81407: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853761.81413: handler run complete 30583 1726853761.81577: attempt loop complete, returning result 30583 1726853761.81580: _execute() done 30583 1726853761.81582: dumping result to json 30583 1726853761.81584: done dumping result, returning 30583 1726853761.81585: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-05ea-abc5-000000001b50] 30583 1726853761.81587: sending task result for task 02083763-bbaf-05ea-abc5-000000001b50 30583 1726853761.81649: done sending task result for task 02083763-bbaf-05ea-abc5-000000001b50 30583 1726853761.81653: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 30583 1726853761.81838: no more pending results, returning what we have 30583 1726853761.81842: results queue empty 30583 1726853761.81843: checking for any_errors_fatal 30583 1726853761.81848: done checking for any_errors_fatal 30583 1726853761.81849: checking for max_fail_percentage 30583 1726853761.81850: done checking for max_fail_percentage 30583 1726853761.81851: checking to see if all hosts have failed and the running result is not ok 30583 1726853761.81852: done checking to see if all hosts have failed 30583 1726853761.81853: getting the remaining hosts for this loop 30583 1726853761.81854: done getting the remaining hosts for this loop 30583 1726853761.81858: getting the next task for host managed_node2 30583 1726853761.81868: done getting next task for host managed_node2 30583 1726853761.81872: ^ task is: TASK: meta (role_complete) 30583 1726853761.81877: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853761.81889: getting variables 30583 1726853761.81891: in VariableManager get_vars() 30583 1726853761.81934: Calling all_inventory to load vars for managed_node2 30583 1726853761.81937: Calling groups_inventory to load vars for managed_node2 30583 1726853761.81941: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853761.81950: Calling all_plugins_play to load vars for managed_node2 30583 1726853761.81954: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853761.81956: Calling groups_plugins_play to load vars for managed_node2 30583 1726853761.83639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853761.85265: done with get_vars() 30583 1726853761.85298: done getting variables 30583 1726853761.85400: done queuing things up, now waiting for results queue to drain 30583 1726853761.85403: results queue empty 30583 1726853761.85403: checking for any_errors_fatal 30583 1726853761.85407: done checking for 
any_errors_fatal 30583 1726853761.85407: checking for max_fail_percentage 30583 1726853761.85409: done checking for max_fail_percentage 30583 1726853761.85409: checking to see if all hosts have failed and the running result is not ok 30583 1726853761.85410: done checking to see if all hosts have failed 30583 1726853761.85411: getting the remaining hosts for this loop 30583 1726853761.85412: done getting the remaining hosts for this loop 30583 1726853761.85415: getting the next task for host managed_node2 30583 1726853761.85421: done getting next task for host managed_node2 30583 1726853761.85425: ^ task is: TASK: Test 30583 1726853761.85427: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853761.85430: getting variables 30583 1726853761.85431: in VariableManager get_vars() 30583 1726853761.85446: Calling all_inventory to load vars for managed_node2 30583 1726853761.85455: Calling groups_inventory to load vars for managed_node2 30583 1726853761.85458: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853761.85464: Calling all_plugins_play to load vars for managed_node2 30583 1726853761.85466: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853761.85469: Calling groups_plugins_play to load vars for managed_node2 30583 1726853761.86682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853761.88497: done with get_vars() 30583 1726853761.88521: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 13:36:01 -0400 (0:00:00.426) 0:01:37.223 ****** 30583 1726853761.88602: entering _queue_task() for managed_node2/include_tasks 30583 1726853761.89076: worker is 1 (out of 1 available) 30583 1726853761.89089: exiting _queue_task() for managed_node2/include_tasks 30583 1726853761.89101: done queuing things up, now waiting for results queue to drain 30583 1726853761.89103: waiting for pending results... 
30583 1726853761.89373: running TaskExecutor() for managed_node2/TASK: Test 30583 1726853761.89487: in run() - task 02083763-bbaf-05ea-abc5-000000001748 30583 1726853761.89501: variable 'ansible_search_path' from source: unknown 30583 1726853761.89510: variable 'ansible_search_path' from source: unknown 30583 1726853761.89564: variable 'lsr_test' from source: include params 30583 1726853761.89799: variable 'lsr_test' from source: include params 30583 1726853761.89873: variable 'omit' from source: magic vars 30583 1726853761.90132: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853761.90136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853761.90139: variable 'omit' from source: magic vars 30583 1726853761.90303: variable 'ansible_distribution_major_version' from source: facts 30583 1726853761.90367: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853761.90372: variable 'item' from source: unknown 30583 1726853761.90394: variable 'item' from source: unknown 30583 1726853761.90433: variable 'item' from source: unknown 30583 1726853761.90509: variable 'item' from source: unknown 30583 1726853761.90739: dumping result to json 30583 1726853761.90742: done dumping result, returning 30583 1726853761.90752: done running TaskExecutor() for managed_node2/TASK: Test [02083763-bbaf-05ea-abc5-000000001748] 30583 1726853761.90754: sending task result for task 02083763-bbaf-05ea-abc5-000000001748 30583 1726853761.90798: done sending task result for task 02083763-bbaf-05ea-abc5-000000001748 30583 1726853761.90801: WORKER PROCESS EXITING 30583 1726853761.90877: no more pending results, returning what we have 30583 1726853761.90882: in VariableManager get_vars() 30583 1726853761.90923: Calling all_inventory to load vars for managed_node2 30583 1726853761.90926: Calling groups_inventory to load vars for managed_node2 30583 1726853761.90930: Calling all_plugins_inventory to load 
vars for managed_node2 30583 1726853761.90941: Calling all_plugins_play to load vars for managed_node2 30583 1726853761.90944: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853761.90947: Calling groups_plugins_play to load vars for managed_node2 30583 1726853761.92423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853761.94078: done with get_vars() 30583 1726853761.94113: variable 'ansible_search_path' from source: unknown 30583 1726853761.94115: variable 'ansible_search_path' from source: unknown 30583 1726853761.94157: we have included files to process 30583 1726853761.94158: generating all_blocks data 30583 1726853761.94160: done generating all_blocks data 30583 1726853761.94165: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30583 1726853761.94166: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30583 1726853761.94169: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30583 1726853761.94375: done processing included file 30583 1726853761.94377: iterating over new_blocks loaded from include file 30583 1726853761.94378: in VariableManager get_vars() 30583 1726853761.94396: done with get_vars() 30583 1726853761.94398: filtering new block on tags 30583 1726853761.94433: done filtering new block on tags 30583 1726853761.94436: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml for managed_node2 => (item=tasks/remove+down_profile.yml) 30583 1726853761.94441: extending task lists for all hosts with included blocks 30583 1726853761.95423: done extending task 
lists 30583 1726853761.95424: done processing included files 30583 1726853761.95425: results queue empty 30583 1726853761.95426: checking for any_errors_fatal 30583 1726853761.95428: done checking for any_errors_fatal 30583 1726853761.95429: checking for max_fail_percentage 30583 1726853761.95430: done checking for max_fail_percentage 30583 1726853761.95431: checking to see if all hosts have failed and the running result is not ok 30583 1726853761.95431: done checking to see if all hosts have failed 30583 1726853761.95432: getting the remaining hosts for this loop 30583 1726853761.95433: done getting the remaining hosts for this loop 30583 1726853761.95437: getting the next task for host managed_node2 30583 1726853761.95441: done getting next task for host managed_node2 30583 1726853761.95443: ^ task is: TASK: Include network role 30583 1726853761.95446: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853761.95449: getting variables 30583 1726853761.95450: in VariableManager get_vars() 30583 1726853761.95464: Calling all_inventory to load vars for managed_node2 30583 1726853761.95466: Calling groups_inventory to load vars for managed_node2 30583 1726853761.95469: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853761.95476: Calling all_plugins_play to load vars for managed_node2 30583 1726853761.95479: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853761.95489: Calling groups_plugins_play to load vars for managed_node2 30583 1726853761.96865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853761.98460: done with get_vars() 30583 1726853761.98498: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml:3 Friday 20 September 2024 13:36:01 -0400 (0:00:00.100) 0:01:37.323 ****** 30583 1726853761.98610: entering _queue_task() for managed_node2/include_role 30583 1726853761.99015: worker is 1 (out of 1 available) 30583 1726853761.99028: exiting _queue_task() for managed_node2/include_role 30583 1726853761.99043: done queuing things up, now waiting for results queue to drain 30583 1726853761.99044: waiting for pending results... 
30583 1726853761.99356: running TaskExecutor() for managed_node2/TASK: Include network role
30583 1726853761.99563: in run() - task 02083763-bbaf-05ea-abc5-000000001ca9
30583 1726853761.99568: variable 'ansible_search_path' from source: unknown
30583 1726853761.99572: variable 'ansible_search_path' from source: unknown
30583 1726853761.99576: calling self._execute()
30583 1726853761.99673: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853761.99687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853761.99702: variable 'omit' from source: magic vars
30583 1726853762.00248: variable 'ansible_distribution_major_version' from source: facts
30583 1726853762.00272: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853762.00476: _execute() done
30583 1726853762.00480: dumping result to json
30583 1726853762.00483: done dumping result, returning
30583 1726853762.00485: done running TaskExecutor() for managed_node2/TASK: Include network role [02083763-bbaf-05ea-abc5-000000001ca9]
30583 1726853762.00488: sending task result for task 02083763-bbaf-05ea-abc5-000000001ca9
30583 1726853762.00577: done sending task result for task 02083763-bbaf-05ea-abc5-000000001ca9
30583 1726853762.00581: WORKER PROCESS EXITING
30583 1726853762.00613: no more pending results, returning what we have
30583 1726853762.00618: in VariableManager get_vars()
30583 1726853762.00663: Calling all_inventory to load vars for managed_node2
30583 1726853762.00666: Calling groups_inventory to load vars for managed_node2
30583 1726853762.00669: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853762.00682: Calling all_plugins_play to load vars for managed_node2
30583 1726853762.00685: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853762.00687: Calling groups_plugins_play to load vars for managed_node2
30583 1726853762.02046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853762.03592: done with get_vars()
30583 1726853762.03620: variable 'ansible_search_path' from source: unknown
30583 1726853762.03621: variable 'ansible_search_path' from source: unknown
30583 1726853762.03757: variable 'omit' from source: magic vars
30583 1726853762.03801: variable 'omit' from source: magic vars
30583 1726853762.03815: variable 'omit' from source: magic vars
30583 1726853762.03819: we have included files to process
30583 1726853762.03820: generating all_blocks data
30583 1726853762.03821: done generating all_blocks data
30583 1726853762.03822: processing included file: fedora.linux_system_roles.network
30583 1726853762.03844: in VariableManager get_vars()
30583 1726853762.03865: done with get_vars()
30583 1726853762.03897: in VariableManager get_vars()
30583 1726853762.03920: done with get_vars()
30583 1726853762.03963: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
30583 1726853762.04092: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
30583 1726853762.04177: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
30583 1726853762.04746: in VariableManager get_vars()
30583 1726853762.04772: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
30583 1726853762.06690: iterating over new_blocks loaded from include file
30583 1726853762.06692: in VariableManager get_vars()
30583 1726853762.06710: done with get_vars()
30583 1726853762.06712: filtering new block on tags
30583 1726853762.07003: done filtering new block on tags
30583 1726853762.07007: in VariableManager get_vars()
30583 1726853762.07024: done with get_vars()
30583 1726853762.07026: filtering new block on tags
30583 1726853762.07044: done filtering new block on tags
30583 1726853762.07046: done iterating over new_blocks loaded from include file
included: fedora.linux_system_roles.network for managed_node2
30583 1726853762.07052: extending task lists for all hosts with included blocks
30583 1726853762.07167: done extending task lists
30583 1726853762.07168: done processing included files
30583 1726853762.07169: results queue empty
30583 1726853762.07170: checking for any_errors_fatal
30583 1726853762.07176: done checking for any_errors_fatal
30583 1726853762.07177: checking for max_fail_percentage
30583 1726853762.07178: done checking for max_fail_percentage
30583 1726853762.07179: checking to see if all hosts have failed and the running result is not ok
30583 1726853762.07180: done checking to see if all hosts have failed
30583 1726853762.07180: getting the remaining hosts for this loop
30583 1726853762.07182: done getting the remaining hosts for this loop
30583 1726853762.07185: getting the next task for host managed_node2
30583 1726853762.07190: done getting next task for host managed_node2
30583 1726853762.07193: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
30583 1726853762.07196: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853762.07207: getting variables
30583 1726853762.07208: in VariableManager get_vars()
30583 1726853762.07222: Calling all_inventory to load vars for managed_node2
30583 1726853762.07224: Calling groups_inventory to load vars for managed_node2
30583 1726853762.07226: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853762.07232: Calling all_plugins_play to load vars for managed_node2
30583 1726853762.07234: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853762.07237: Calling groups_plugins_play to load vars for managed_node2
30583 1726853762.08479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853762.11026: done with get_vars()
30583 1726853762.11064: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024 13:36:02 -0400 (0:00:00.125) 0:01:37.449 ******
30583 1726853762.11206: entering _queue_task() for managed_node2/include_tasks
30583 1726853762.11591: worker is 1 (out of 1 available)
30583 1726853762.11606: exiting _queue_task() for managed_node2/include_tasks
30583 1726853762.11619: done queuing things up, now waiting for results queue to drain
30583 1726853762.11620: waiting for pending results...
30583 1726853762.11913: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
30583 1726853762.12123: in run() - task 02083763-bbaf-05ea-abc5-000000001d2b
30583 1726853762.12144: variable 'ansible_search_path' from source: unknown
30583 1726853762.12154: variable 'ansible_search_path' from source: unknown
30583 1726853762.12201: calling self._execute()
30583 1726853762.12329: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853762.12341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853762.12357: variable 'omit' from source: magic vars
30583 1726853762.12785: variable 'ansible_distribution_major_version' from source: facts
30583 1726853762.12803: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853762.12977: _execute() done
30583 1726853762.12982: dumping result to json
30583 1726853762.12985: done dumping result, returning
30583 1726853762.12988: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-05ea-abc5-000000001d2b]
30583 1726853762.12991: sending task result for task 02083763-bbaf-05ea-abc5-000000001d2b
30583 1726853762.13075: done sending task result for task 02083763-bbaf-05ea-abc5-000000001d2b
30583 1726853762.13078: WORKER PROCESS EXITING
30583 1726853762.13136: no more pending results, returning what we have
30583 1726853762.13142: in VariableManager get_vars()
30583 1726853762.13227: Calling all_inventory to load vars for managed_node2
30583 1726853762.13230: Calling groups_inventory to load vars for managed_node2
30583 1726853762.13233: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853762.13245: Calling all_plugins_play to load vars for managed_node2
30583 1726853762.13249: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853762.13253: Calling groups_plugins_play to load vars for managed_node2
30583 1726853762.16294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853762.19150: done with get_vars()
30583 1726853762.19185: variable 'ansible_search_path' from source: unknown
30583 1726853762.19187: variable 'ansible_search_path' from source: unknown
30583 1726853762.19227: we have included files to process
30583 1726853762.19229: generating all_blocks data
30583 1726853762.19231: done generating all_blocks data
30583 1726853762.19234: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
30583 1726853762.19235: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
30583 1726853762.19237: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
30583 1726853762.19836: done processing included file
30583 1726853762.19838: iterating over new_blocks loaded from include file
30583 1726853762.19839: in VariableManager get_vars()
30583 1726853762.19868: done with get_vars()
30583 1726853762.19869: filtering new block on tags
30583 1726853762.19898: done filtering new block on tags
30583 1726853762.19901: in VariableManager get_vars()
30583 1726853762.19923: done with get_vars()
30583 1726853762.19925: filtering new block on tags
30583 1726853762.19976: done filtering new block on tags
30583 1726853762.19979: in VariableManager get_vars()
30583 1726853762.20003: done with get_vars()
30583 1726853762.20004: filtering new block on tags
30583 1726853762.20040: done filtering new block on tags
30583 1726853762.20042: done iterating over new_blocks loaded from include file
included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2
30583 1726853762.20047: extending task lists for all hosts with included blocks
30583 1726853762.22057: done extending task lists
30583 1726853762.22058: done processing included files
30583 1726853762.22059: results queue empty
30583 1726853762.22060: checking for any_errors_fatal
30583 1726853762.22062: done checking for any_errors_fatal
30583 1726853762.22063: checking for max_fail_percentage
30583 1726853762.22064: done checking for max_fail_percentage
30583 1726853762.22065: checking to see if all hosts have failed and the running result is not ok
30583 1726853762.22066: done checking to see if all hosts have failed
30583 1726853762.22067: getting the remaining hosts for this loop
30583 1726853762.22069: done getting the remaining hosts for this loop
30583 1726853762.22081: getting the next task for host managed_node2
30583 1726853762.22093: done getting next task for host managed_node2
30583 1726853762.22096: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
30583 1726853762.22101: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853762.22114: getting variables
30583 1726853762.22115: in VariableManager get_vars()
30583 1726853762.22132: Calling all_inventory to load vars for managed_node2
30583 1726853762.22135: Calling groups_inventory to load vars for managed_node2
30583 1726853762.22137: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853762.22142: Calling all_plugins_play to load vars for managed_node2
30583 1726853762.22145: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853762.22148: Calling groups_plugins_play to load vars for managed_node2
30583 1726853762.23523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853762.26063: done with get_vars()
30583 1726853762.26258: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Friday 20 September 2024 13:36:02 -0400 (0:00:00.151) 0:01:37.600 ******
30583 1726853762.26464: entering _queue_task() for managed_node2/setup
30583 1726853762.27256: worker is 1 (out of 1 available)
30583 1726853762.27267: exiting _queue_task() for managed_node2/setup
30583 1726853762.27282: done queuing things up, now waiting for results queue to drain
30583 1726853762.27284: waiting for pending results...
30583 1726853762.27715: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
30583 1726853762.27720: in run() - task 02083763-bbaf-05ea-abc5-000000001d82
30583 1726853762.27723: variable 'ansible_search_path' from source: unknown
30583 1726853762.27726: variable 'ansible_search_path' from source: unknown
30583 1726853762.28078: calling self._execute()
30583 1726853762.28083: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853762.28085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853762.28088: variable 'omit' from source: magic vars
30583 1726853762.28417: variable 'ansible_distribution_major_version' from source: facts
30583 1726853762.28552: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853762.29077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30583 1726853762.31732: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30583 1726853762.31918: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30583 1726853762.31961: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30583 1726853762.31992: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30583 1726853762.32017: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30583 1726853762.32423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853762.32464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853762.32485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853762.32528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853762.32540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853762.32588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853762.32608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853762.32639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853762.32675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853762.32689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853762.32902: variable '__network_required_facts' from source: role '' defaults
30583 1726853762.32920: variable 'ansible_facts' from source: unknown
30583 1726853762.33710: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
30583 1726853762.33720: when evaluation is False, skipping this task
30583 1726853762.33729: _execute() done
30583 1726853762.33736: dumping result to json
30583 1726853762.33743: done dumping result, returning
30583 1726853762.33756: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-05ea-abc5-000000001d82]
30583 1726853762.33770: sending task result for task 02083763-bbaf-05ea-abc5-000000001d82
skipping: [managed_node2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
30583 1726853762.33933: no more pending results, returning what we have
30583 1726853762.33937: results queue empty
30583 1726853762.33938: checking for any_errors_fatal
30583 1726853762.33940: done checking for any_errors_fatal
30583 1726853762.33940: checking for max_fail_percentage
30583 1726853762.33942: done checking for max_fail_percentage
30583 1726853762.33943: checking to see if all hosts have failed and the running result is not ok
30583 1726853762.33944: done checking to see if all hosts have failed
30583 1726853762.33944: getting the remaining hosts for this loop
30583 1726853762.33946: done getting the remaining hosts for this loop
30583 1726853762.33950: getting the next task for host managed_node2
30583 1726853762.33963: done getting next task for host managed_node2
30583 1726853762.33966: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
30583 1726853762.33973: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853762.34004: getting variables
30583 1726853762.34006: in VariableManager get_vars()
30583 1726853762.34057: Calling all_inventory to load vars for managed_node2
30583 1726853762.34060: Calling groups_inventory to load vars for managed_node2
30583 1726853762.34062: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853762.34074: Calling all_plugins_play to load vars for managed_node2
30583 1726853762.34078: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853762.34081: Calling groups_plugins_play to load vars for managed_node2
30583 1726853762.34830: done sending task result for task 02083763-bbaf-05ea-abc5-000000001d82
30583 1726853762.34839: WORKER PROCESS EXITING
30583 1726853762.35919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853762.37566: done with get_vars()
30583 1726853762.37596: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Friday 20 September 2024 13:36:02 -0400 (0:00:00.113) 0:01:37.714 ******
30583 1726853762.37709: entering _queue_task() for managed_node2/stat
30583 1726853762.38089: worker is 1 (out of 1 available)
30583 1726853762.38101: exiting _queue_task() for managed_node2/stat
30583 1726853762.38113: done queuing things up, now waiting for results queue to drain
30583 1726853762.38114: waiting for pending results...
30583 1726853762.38500: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree
30583 1726853762.38614: in run() - task 02083763-bbaf-05ea-abc5-000000001d84
30583 1726853762.38631: variable 'ansible_search_path' from source: unknown
30583 1726853762.38640: variable 'ansible_search_path' from source: unknown
30583 1726853762.38686: calling self._execute()
30583 1726853762.38795: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853762.38812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853762.38828: variable 'omit' from source: magic vars
30583 1726853762.39222: variable 'ansible_distribution_major_version' from source: facts
30583 1726853762.39245: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853762.39428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30583 1726853762.39776: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30583 1726853762.39781: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30583 1726853762.39815: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30583 1726853762.39860: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30583 1726853762.39955: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30583 1726853762.40005: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30583 1726853762.40027: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853762.40112: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30583 1726853762.40156: variable '__network_is_ostree' from source: set_fact
30583 1726853762.40173: Evaluated conditional (not __network_is_ostree is defined): False
30583 1726853762.40182: when evaluation is False, skipping this task
30583 1726853762.40189: _execute() done
30583 1726853762.40196: dumping result to json
30583 1726853762.40203: done dumping result, returning
30583 1726853762.40218: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-05ea-abc5-000000001d84]
30583 1726853762.40230: sending task result for task 02083763-bbaf-05ea-abc5-000000001d84
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
30583 1726853762.40526: no more pending results, returning what we have
30583 1726853762.40531: results queue empty
30583 1726853762.40532: checking for any_errors_fatal
30583 1726853762.40540: done checking for any_errors_fatal
30583 1726853762.40541: checking for max_fail_percentage
30583 1726853762.40543: done checking for max_fail_percentage
30583 1726853762.40544: checking to see if all hosts have failed and the running result is not ok
30583 1726853762.40545: done checking to see if all hosts have failed
30583 1726853762.40546: getting the remaining hosts for this loop
30583 1726853762.40548: done getting the remaining hosts for this loop
30583 1726853762.40551: getting the next task for host managed_node2
30583 1726853762.40562: done getting next task for host managed_node2
30583 1726853762.40566: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
30583 1726853762.40573: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853762.40599: getting variables
30583 1726853762.40600: in VariableManager get_vars()
30583 1726853762.40645: Calling all_inventory to load vars for managed_node2
30583 1726853762.40648: Calling groups_inventory to load vars for managed_node2
30583 1726853762.40650: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853762.40662: Calling all_plugins_play to load vars for managed_node2
30583 1726853762.40666: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853762.40669: Calling groups_plugins_play to load vars for managed_node2
30583 1726853762.40787: done sending task result for task 02083763-bbaf-05ea-abc5-000000001d84
30583 1726853762.40790: WORKER PROCESS EXITING
30583 1726853762.42602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853762.44463: done with get_vars()
30583 1726853762.44495: done getting variables
30583 1726853762.44564: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Friday 20 September 2024 13:36:02 -0400 (0:00:00.068) 0:01:37.783 ******
30583 1726853762.44609: entering _queue_task() for managed_node2/set_fact
30583 1726853762.45186: worker is 1 (out of 1 available)
30583 1726853762.45197: exiting _queue_task() for managed_node2/set_fact
30583 1726853762.45209: done queuing things up, now waiting for results queue to drain
30583 1726853762.45210: waiting for pending results...
30583 1726853762.45353: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
30583 1726853762.45677: in run() - task 02083763-bbaf-05ea-abc5-000000001d85
30583 1726853762.45682: variable 'ansible_search_path' from source: unknown
30583 1726853762.45685: variable 'ansible_search_path' from source: unknown
30583 1726853762.45688: calling self._execute()
30583 1726853762.45691: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853762.45694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853762.45697: variable 'omit' from source: magic vars
30583 1726853762.46095: variable 'ansible_distribution_major_version' from source: facts
30583 1726853762.46107: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853762.46284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30583 1726853762.46568: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30583 1726853762.46617: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30583 1726853762.46661: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30583 1726853762.46697: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30583 1726853762.46808: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30583 1726853762.46828: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30583 1726853762.46864: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853762.46890: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30583 1726853762.46992: variable '__network_is_ostree' from source: set_fact
30583 1726853762.46995: Evaluated conditional (not __network_is_ostree is defined): False
30583 1726853762.46998: when evaluation is False, skipping this task
30583 1726853762.47003: _execute() done
30583 1726853762.47006: dumping result to json
30583 1726853762.47008: done dumping result, returning
30583 1726853762.47017: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-05ea-abc5-000000001d85]
30583 1726853762.47020: sending task result for task 02083763-bbaf-05ea-abc5-000000001d85
30583 1726853762.47121: done sending task result for task 02083763-bbaf-05ea-abc5-000000001d85
30583 1726853762.47125: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
30583 1726853762.47182: no more pending results, returning what we have
30583 1726853762.47186: results queue empty
30583 1726853762.47187: checking for any_errors_fatal
30583 1726853762.47196: done checking for any_errors_fatal
30583 1726853762.47197: checking for max_fail_percentage
30583 1726853762.47199: done checking for max_fail_percentage
30583 1726853762.47201: checking to see if all hosts have failed and the running result is not ok
30583 1726853762.47201: done checking to see if all hosts have failed
30583 1726853762.47202: getting the remaining hosts for this loop
30583 1726853762.47204: done getting the remaining hosts for this loop
30583 1726853762.47208: getting the next task for host managed_node2 30583 1726853762.47221: done getting next task for host managed_node2 30583 1726853762.47225: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853762.47231: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853762.47261: getting variables 30583 1726853762.47263: in VariableManager get_vars() 30583 1726853762.47313: Calling all_inventory to load vars for managed_node2 30583 1726853762.47317: Calling groups_inventory to load vars for managed_node2 30583 1726853762.47319: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853762.47330: Calling all_plugins_play to load vars for managed_node2 30583 1726853762.47334: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853762.47337: Calling groups_plugins_play to load vars for managed_node2 30583 1726853762.49025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853762.50700: done with get_vars() 30583 1726853762.50730: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:36:02 -0400 (0:00:00.062) 0:01:37.845 ****** 30583 1726853762.50843: entering _queue_task() for managed_node2/service_facts 30583 1726853762.51247: worker is 1 (out of 1 available) 30583 1726853762.51261: exiting _queue_task() for managed_node2/service_facts 30583 1726853762.51405: done queuing things up, now waiting for results queue to drain 30583 1726853762.51407: waiting for pending results... 
30583 1726853762.51791: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853762.51797: in run() - task 02083763-bbaf-05ea-abc5-000000001d87 30583 1726853762.51800: variable 'ansible_search_path' from source: unknown 30583 1726853762.51803: variable 'ansible_search_path' from source: unknown 30583 1726853762.51806: calling self._execute() 30583 1726853762.51906: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853762.51910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853762.51920: variable 'omit' from source: magic vars 30583 1726853762.52355: variable 'ansible_distribution_major_version' from source: facts 30583 1726853762.52383: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853762.52388: variable 'omit' from source: magic vars 30583 1726853762.52465: variable 'omit' from source: magic vars 30583 1726853762.52511: variable 'omit' from source: magic vars 30583 1726853762.52555: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853762.52591: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853762.52621: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853762.52640: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853762.52652: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853762.52686: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853762.52690: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853762.52693: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853762.52805: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853762.52811: Set connection var ansible_timeout to 10 30583 1726853762.52813: Set connection var ansible_connection to ssh 30583 1726853762.52831: Set connection var ansible_shell_executable to /bin/sh 30583 1726853762.52834: Set connection var ansible_shell_type to sh 30583 1726853762.52844: Set connection var ansible_pipelining to False 30583 1726853762.52869: variable 'ansible_shell_executable' from source: unknown 30583 1726853762.52874: variable 'ansible_connection' from source: unknown 30583 1726853762.52877: variable 'ansible_module_compression' from source: unknown 30583 1726853762.52879: variable 'ansible_shell_type' from source: unknown 30583 1726853762.52882: variable 'ansible_shell_executable' from source: unknown 30583 1726853762.52884: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853762.52886: variable 'ansible_pipelining' from source: unknown 30583 1726853762.52890: variable 'ansible_timeout' from source: unknown 30583 1726853762.52893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853762.53117: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853762.53122: variable 'omit' from source: magic vars 30583 1726853762.53125: starting attempt loop 30583 1726853762.53127: running the handler 30583 1726853762.53142: _low_level_execute_command(): starting 30583 1726853762.53162: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853762.54061: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853762.54067: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853762.54073: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853762.54076: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853762.54078: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853762.54170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853762.56278: stdout chunk (state=3): >>>/root <<< 30583 1726853762.56283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853762.56285: stdout chunk (state=3): >>><<< 30583 1726853762.56287: stderr chunk (state=3): >>><<< 30583 1726853762.56290: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853762.56293: _low_level_execute_command(): starting 30583 1726853762.56296: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853762.5620475-35190-177171704235793 `" && echo ansible-tmp-1726853762.5620475-35190-177171704235793="` echo /root/.ansible/tmp/ansible-tmp-1726853762.5620475-35190-177171704235793 `" ) && sleep 0' 30583 1726853762.57311: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853762.57315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853762.57475: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853762.57589: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853762.57687: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853762.59792: stdout chunk (state=3): >>>ansible-tmp-1726853762.5620475-35190-177171704235793=/root/.ansible/tmp/ansible-tmp-1726853762.5620475-35190-177171704235793 <<< 30583 1726853762.59874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853762.59878: stderr chunk (state=3): >>><<< 30583 1726853762.59881: stdout chunk (state=3): >>><<< 30583 1726853762.59902: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853762.5620475-35190-177171704235793=/root/.ansible/tmp/ansible-tmp-1726853762.5620475-35190-177171704235793 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853762.59966: variable 'ansible_module_compression' from source: unknown 30583 1726853762.60014: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30583 1726853762.60064: variable 'ansible_facts' from source: unknown 30583 1726853762.60162: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853762.5620475-35190-177171704235793/AnsiballZ_service_facts.py 30583 1726853762.60398: Sending initial data 30583 1726853762.60401: Sent initial data (162 bytes) 30583 1726853762.60918: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853762.60932: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853762.60948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853762.60989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853762.61087: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853762.61117: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853762.61217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853762.63266: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853762.63286: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853762.63345: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpdo3crxcy /root/.ansible/tmp/ansible-tmp-1726853762.5620475-35190-177171704235793/AnsiballZ_service_facts.py <<< 30583 1726853762.63349: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853762.5620475-35190-177171704235793/AnsiballZ_service_facts.py" <<< 30583 1726853762.63419: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpdo3crxcy" to remote "/root/.ansible/tmp/ansible-tmp-1726853762.5620475-35190-177171704235793/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853762.5620475-35190-177171704235793/AnsiballZ_service_facts.py" <<< 30583 1726853762.64729: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853762.64875: stderr chunk (state=3): >>><<< 30583 1726853762.64878: stdout chunk (state=3): >>><<< 30583 1726853762.64891: done transferring module to remote 30583 1726853762.64905: _low_level_execute_command(): starting 30583 1726853762.64912: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853762.5620475-35190-177171704235793/ /root/.ansible/tmp/ansible-tmp-1726853762.5620475-35190-177171704235793/AnsiballZ_service_facts.py && sleep 0' 30583 1726853762.65717: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853762.65939: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853762.66132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853762.66199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853762.66733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853762.68687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853762.68691: stdout chunk (state=3): >>><<< 30583 1726853762.68697: stderr chunk (state=3): >>><<< 30583 1726853762.68747: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853762.68778: _low_level_execute_command(): starting 30583 1726853762.68782: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853762.5620475-35190-177171704235793/AnsiballZ_service_facts.py && sleep 0' 30583 1726853762.69968: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853762.69975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853762.69994: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853762.70002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853762.70014: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853762.70020: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853762.70144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 30583 1726853762.70219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853764.37980: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": 
"rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-ma<<< 30583 1726853764.38003: stdout chunk (state=3): >>>rk.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": 
"systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.s<<< 30583 1726853764.38037: stdout chunk (state=3): >>>ervice", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": 
"systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, 
"systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "stat<<< 30583 1726853764.38077: stdout chunk (state=3): >>>us": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": 
"chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": 
{"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": 
"systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": 
{"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": <<< 30583 1726853764.38098: stdout chunk (state=3): >>>"static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30583 1726853764.39878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853764.39884: stdout chunk (state=3): >>><<< 30583 1726853764.39887: stderr chunk (state=3): >>><<< 30583 1726853764.39894: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": 
"dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", 
"status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": 
{"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": 
{"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": 
"systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.9.197 closed. 30583 1726853764.42684: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853762.5620475-35190-177171704235793/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853764.42689: _low_level_execute_command(): starting 30583 1726853764.42691: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853762.5620475-35190-177171704235793/ > /dev/null 2>&1 && sleep 0' 30583 1726853764.43992: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853764.44104: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853764.44299: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853764.44344: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853764.44439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853764.46592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853764.46596: stdout chunk (state=3): >>><<< 30583 1726853764.46601: stderr chunk (state=3): >>><<< 30583 1726853764.46619: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853764.46625: handler run complete 30583 1726853764.47185: variable 'ansible_facts' from source: 
unknown 30583 1726853764.47713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853764.49390: variable 'ansible_facts' from source: unknown 30583 1726853764.49530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853764.50157: attempt loop complete, returning result 30583 1726853764.50165: _execute() done 30583 1726853764.50168: dumping result to json 30583 1726853764.50680: done dumping result, returning 30583 1726853764.50683: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-05ea-abc5-000000001d87] 30583 1726853764.50685: sending task result for task 02083763-bbaf-05ea-abc5-000000001d87 30583 1726853764.53338: done sending task result for task 02083763-bbaf-05ea-abc5-000000001d87 30583 1726853764.53342: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853764.53461: no more pending results, returning what we have 30583 1726853764.53464: results queue empty 30583 1726853764.53465: checking for any_errors_fatal 30583 1726853764.53468: done checking for any_errors_fatal 30583 1726853764.53469: checking for max_fail_percentage 30583 1726853764.53473: done checking for max_fail_percentage 30583 1726853764.53474: checking to see if all hosts have failed and the running result is not ok 30583 1726853764.53475: done checking to see if all hosts have failed 30583 1726853764.53476: getting the remaining hosts for this loop 30583 1726853764.53477: done getting the remaining hosts for this loop 30583 1726853764.53481: getting the next task for host managed_node2 30583 1726853764.53487: done getting next task for host managed_node2 30583 1726853764.53491: ^ task is: TASK: fedora.linux_system_roles.network : Check which 
packages are installed 30583 1726853764.53497: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853764.53508: getting variables 30583 1726853764.53510: in VariableManager get_vars() 30583 1726853764.53540: Calling all_inventory to load vars for managed_node2 30583 1726853764.53543: Calling groups_inventory to load vars for managed_node2 30583 1726853764.53546: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853764.53554: Calling all_plugins_play to load vars for managed_node2 30583 1726853764.53558: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853764.53561: Calling groups_plugins_play to load vars for managed_node2 30583 1726853764.56134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853764.58763: done with get_vars() 30583 1726853764.58796: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:36:04 -0400 (0:00:02.080) 0:01:39.926 ****** 30583 1726853764.58899: entering _queue_task() for managed_node2/package_facts 30583 1726853764.59438: worker is 1 (out of 1 available) 30583 1726853764.59452: exiting _queue_task() for managed_node2/package_facts 30583 1726853764.59464: done queuing things up, now waiting for results queue to drain 30583 1726853764.59465: waiting for pending results... 
30583 1726853764.59901: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 1726853764.59906: in run() - task 02083763-bbaf-05ea-abc5-000000001d88 30583 1726853764.59927: variable 'ansible_search_path' from source: unknown 30583 1726853764.59937: variable 'ansible_search_path' from source: unknown 30583 1726853764.60082: calling self._execute() 30583 1726853764.60153: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853764.60217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853764.60262: variable 'omit' from source: magic vars 30583 1726853764.61111: variable 'ansible_distribution_major_version' from source: facts 30583 1726853764.61322: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853764.61325: variable 'omit' from source: magic vars 30583 1726853764.61327: variable 'omit' from source: magic vars 30583 1726853764.61329: variable 'omit' from source: magic vars 30583 1726853764.61468: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853764.61688: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853764.61715: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853764.61737: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853764.61821: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853764.61863: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853764.61933: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853764.61962: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853764.62097: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853764.62222: Set connection var ansible_timeout to 10 30583 1726853764.62225: Set connection var ansible_connection to ssh 30583 1726853764.62228: Set connection var ansible_shell_executable to /bin/sh 30583 1726853764.62230: Set connection var ansible_shell_type to sh 30583 1726853764.62232: Set connection var ansible_pipelining to False 30583 1726853764.62234: variable 'ansible_shell_executable' from source: unknown 30583 1726853764.62237: variable 'ansible_connection' from source: unknown 30583 1726853764.62239: variable 'ansible_module_compression' from source: unknown 30583 1726853764.62241: variable 'ansible_shell_type' from source: unknown 30583 1726853764.62243: variable 'ansible_shell_executable' from source: unknown 30583 1726853764.62245: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853764.62247: variable 'ansible_pipelining' from source: unknown 30583 1726853764.62249: variable 'ansible_timeout' from source: unknown 30583 1726853764.62251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853764.62450: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853764.62466: variable 'omit' from source: magic vars 30583 1726853764.62478: starting attempt loop 30583 1726853764.62484: running the handler 30583 1726853764.62501: _low_level_execute_command(): starting 30583 1726853764.62517: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853764.63235: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853764.63289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853764.63306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853764.63392: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853764.63429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853764.63549: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853764.65293: stdout chunk (state=3): >>>/root <<< 30583 1726853764.65347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853764.65483: stderr chunk (state=3): >>><<< 30583 1726853764.65493: stdout chunk (state=3): >>><<< 30583 1726853764.65526: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30583 1726853764.65687: _low_level_execute_command(): starting
30583 1726853764.65691: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853764.6557531-35319-75736853812932 `" && echo ansible-tmp-1726853764.6557531-35319-75736853812932="` echo /root/.ansible/tmp/ansible-tmp-1726853764.6557531-35319-75736853812932 `" ) && sleep 0'
30583 1726853764.66426: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<<
30583 1726853764.66459: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30583 1726853764.66566: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30583 1726853764.68614: stdout chunk (state=3): >>>ansible-tmp-1726853764.6557531-35319-75736853812932=/root/.ansible/tmp/ansible-tmp-1726853764.6557531-35319-75736853812932 <<<
30583 1726853764.68854: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30583 1726853764.68857: stdout chunk (state=3): >>><<<
30583 1726853764.68860: stderr chunk (state=3): >>><<<
30583 1726853764.69077: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853764.6557531-35319-75736853812932=/root/.ansible/tmp/ansible-tmp-1726853764.6557531-35319-75736853812932 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30583 1726853764.69080: variable 'ansible_module_compression' from source: unknown
30583 1726853764.69082: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED
30583 1726853764.69085: variable 'ansible_facts' from source: unknown
30583 1726853764.69283: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853764.6557531-35319-75736853812932/AnsiballZ_package_facts.py
30583 1726853764.69434: Sending initial data
30583 1726853764.69445: Sent initial data (161 bytes)
30583 1726853764.70432: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
30583 1726853764.70588: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<<
30583 1726853764.70696: stderr chunk
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30583 1726853764.70736: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30583 1726853764.70806: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30583 1726853764.72506: stderr chunk (state=3): >>>debug2: Remote version: 3 <<<
30583 1726853764.72596: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
30583 1726853764.72691: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "."
<<<
30583 1726853764.72886: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp2nmia8sy /root/.ansible/tmp/ansible-tmp-1726853764.6557531-35319-75736853812932/AnsiballZ_package_facts.py <<<
30583 1726853764.72906: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853764.6557531-35319-75736853812932/AnsiballZ_package_facts.py" <<<
30583 1726853764.73006: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp2nmia8sy" to remote "/root/.ansible/tmp/ansible-tmp-1726853764.6557531-35319-75736853812932/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853764.6557531-35319-75736853812932/AnsiballZ_package_facts.py" <<<
30583 1726853764.74660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30583 1726853764.74676: stderr chunk (state=3): >>><<<
30583 1726853764.74832: stdout chunk (state=3): >>><<<
30583 1726853764.74835: done transferring module to remote
30583 1726853764.74837: _low_level_execute_command(): starting
30583 1726853764.74839: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853764.6557531-35319-75736853812932/ /root/.ansible/tmp/ansible-tmp-1726853764.6557531-35319-75736853812932/AnsiballZ_package_facts.py && sleep 0'
30583 1726853764.75954: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname
10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853764.75961: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<<
30583 1726853764.75964: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30583 1726853764.75989: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30583 1726853764.76099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30583 1726853764.78135: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30583 1726853764.78138: stdout chunk (state=3): >>><<<
30583 1726853764.78141: stderr chunk (state=3): >>><<<
30583 1726853764.78291: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30583 1726853764.78307: _low_level_execute_command(): starting
30583 1726853764.78311: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853764.6557531-35319-75736853812932/AnsiballZ_package_facts.py && sleep 0'
30583 1726853764.79570: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
30583 1726853764.79577: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
30583 1726853764.79580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
30583 1726853764.79648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853764.79884: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<<
30583 1726853764.79987: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30583 1726853764.80004: stderr chunk (state=3): >>>debug2:
mux_client_hello_exchange: master version 4 <<< 30583 1726853764.80195: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853765.25647: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": 
"2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", 
"version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": 
[{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 30583 1726853765.25695: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": 
"keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": 
"3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 30583 1726853765.25703: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 30583 1726853765.25706: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": 
"6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 30583 1726853765.25805: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", 
"version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", 
"version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", 
"release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", 
"version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": 
"libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 30583 1726853765.25817: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 30583 1726853765.25822: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": 
"511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": 
"2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", 
"release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", 
"release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 30583 1726853765.25857: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], 
"device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30583 1726853765.27670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853765.27676: stderr chunk (state=3): >>>Shared connection to 10.31.9.197 closed. 
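The stdout above is the JSON result of the `package_facts` module: `ansible_facts.packages` maps each package name to a *list* of installed instances (several entries appear when multiple arches or versions coexist), each with `name`, `version`, `release`, `epoch`, `arch`, and `source` keys. A minimal Python sketch of consuming a dict of that shape — sample values are copied from the log output above, and the `installed_version` helper is a hypothetical name, not part of Ansible:

```python
# Shape of ansible_facts.packages as returned by package_facts:
# package name -> list of installed instances. Sample entries taken
# from the log above (epoch is None when the RPM has no epoch).
packages = {
    "openssl": [
        {"name": "openssl", "version": "3.2.2", "release": "12.el10",
         "epoch": 1, "arch": "x86_64", "source": "rpm"},
    ],
    "git": [
        {"name": "git", "version": "2.45.2", "release": "3.el10",
         "epoch": None, "arch": "x86_64", "source": "rpm"},
    ],
}

def installed_version(packages, name):
    """Return the version of the first installed instance, or None
    if the package is absent from the facts dict."""
    instances = packages.get(name, [])
    return instances[0]["version"] if instances else None

print(installed_version(packages, "git"))   # 2.45.2
print(installed_version(packages, "vim"))   # None
```

In a playbook the same lookup would typically be a Jinja2 expression over `ansible_facts.packages` after a `package_facts:` task; the Python form just makes the nesting explicit.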
<<< 30583 1726853765.27697: stderr chunk (state=3): >>><<< 30583 1726853765.27700: stdout chunk (state=3): >>><<< 30583 1726853765.27734: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853765.34586: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853764.6557531-35319-75736853812932/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853765.34600: _low_level_execute_command(): starting 30583 1726853765.34603: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853764.6557531-35319-75736853812932/ > /dev/null 2>&1 && sleep 0' 30583 1726853765.35079: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853765.35083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853765.35085: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853765.35089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853765.35091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853765.35142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853765.35146: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853765.35148: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853765.35228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853765.37208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853765.37212: stdout chunk (state=3): >>><<< 30583 1726853765.37215: stderr chunk (state=3): >>><<< 30583 1726853765.37231: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853765.37380: handler run complete 30583 1726853765.37976: variable 'ansible_facts' from source: unknown 30583 1726853765.38284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853765.39338: variable 'ansible_facts' from source: unknown 30583 1726853765.39574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853765.39953: attempt loop complete, returning result 30583 1726853765.39964: _execute() done 30583 1726853765.39967: dumping result to json 30583 1726853765.40139: done dumping result, returning 30583 1726853765.40145: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-05ea-abc5-000000001d88] 30583 1726853765.40148: sending task result for task 02083763-bbaf-05ea-abc5-000000001d88 30583 1726853765.45851: done sending task result for task 02083763-bbaf-05ea-abc5-000000001d88 30583 1726853765.45855: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853765.45905: no more pending results, returning what we have 30583 1726853765.45907: results queue empty 30583 1726853765.45907: checking for any_errors_fatal 30583 1726853765.45910: done checking for any_errors_fatal 30583 1726853765.45910: checking for max_fail_percentage 30583 1726853765.45911: done checking for max_fail_percentage 30583 1726853765.45912: checking to see if all hosts have failed and the running result is not ok 30583 1726853765.45912: done checking to see if all hosts have failed 30583 1726853765.45913: getting the remaining hosts for this loop 30583 1726853765.45914: done getting the remaining hosts for this loop 30583 1726853765.45915: getting 
the next task for host managed_node2 30583 1726853765.45919: done getting next task for host managed_node2 30583 1726853765.45921: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853765.45925: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853765.45931: getting variables 30583 1726853765.45932: in VariableManager get_vars() 30583 1726853765.45945: Calling all_inventory to load vars for managed_node2 30583 1726853765.45947: Calling groups_inventory to load vars for managed_node2 30583 1726853765.45948: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853765.45952: Calling all_plugins_play to load vars for managed_node2 30583 1726853765.45954: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853765.45955: Calling groups_plugins_play to load vars for managed_node2 30583 1726853765.46588: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853765.47446: done with get_vars() 30583 1726853765.47467: done getting variables 30583 1726853765.47507: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:36:05 -0400 (0:00:00.886) 0:01:40.812 ****** 30583 1726853765.47528: entering _queue_task() for managed_node2/debug 30583 1726853765.47804: worker is 1 (out of 1 available) 30583 1726853765.47818: exiting _queue_task() for managed_node2/debug 30583 1726853765.47830: done queuing things up, now waiting for results queue to drain 30583 1726853765.47831: waiting for pending results... 
30583 1726853765.48028: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853765.48134: in run() - task 02083763-bbaf-05ea-abc5-000000001d2c 30583 1726853765.48145: variable 'ansible_search_path' from source: unknown 30583 1726853765.48149: variable 'ansible_search_path' from source: unknown 30583 1726853765.48185: calling self._execute() 30583 1726853765.48255: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853765.48259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853765.48272: variable 'omit' from source: magic vars 30583 1726853765.48566: variable 'ansible_distribution_major_version' from source: facts 30583 1726853765.48577: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853765.48583: variable 'omit' from source: magic vars 30583 1726853765.48627: variable 'omit' from source: magic vars 30583 1726853765.48702: variable 'network_provider' from source: set_fact 30583 1726853765.48722: variable 'omit' from source: magic vars 30583 1726853765.48752: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853765.48783: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853765.48800: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853765.48812: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853765.48830: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853765.48848: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853765.48851: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 
1726853765.48854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853765.48923: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853765.48929: Set connection var ansible_timeout to 10 30583 1726853765.48933: Set connection var ansible_connection to ssh 30583 1726853765.48935: Set connection var ansible_shell_executable to /bin/sh 30583 1726853765.48938: Set connection var ansible_shell_type to sh 30583 1726853765.48949: Set connection var ansible_pipelining to False 30583 1726853765.48970: variable 'ansible_shell_executable' from source: unknown 30583 1726853765.48975: variable 'ansible_connection' from source: unknown 30583 1726853765.48978: variable 'ansible_module_compression' from source: unknown 30583 1726853765.48980: variable 'ansible_shell_type' from source: unknown 30583 1726853765.48983: variable 'ansible_shell_executable' from source: unknown 30583 1726853765.48985: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853765.48987: variable 'ansible_pipelining' from source: unknown 30583 1726853765.48989: variable 'ansible_timeout' from source: unknown 30583 1726853765.48992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853765.49097: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853765.49108: variable 'omit' from source: magic vars 30583 1726853765.49113: starting attempt loop 30583 1726853765.49116: running the handler 30583 1726853765.49151: handler run complete 30583 1726853765.49376: attempt loop complete, returning result 30583 1726853765.49380: _execute() done 30583 1726853765.49382: dumping result to json 30583 1726853765.49385: done dumping result, returning 
30583 1726853765.49388: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-05ea-abc5-000000001d2c] 30583 1726853765.49390: sending task result for task 02083763-bbaf-05ea-abc5-000000001d2c 30583 1726853765.49460: done sending task result for task 02083763-bbaf-05ea-abc5-000000001d2c 30583 1726853765.49465: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 30583 1726853765.49719: no more pending results, returning what we have 30583 1726853765.49722: results queue empty 30583 1726853765.49723: checking for any_errors_fatal 30583 1726853765.49730: done checking for any_errors_fatal 30583 1726853765.49731: checking for max_fail_percentage 30583 1726853765.49733: done checking for max_fail_percentage 30583 1726853765.49734: checking to see if all hosts have failed and the running result is not ok 30583 1726853765.49735: done checking to see if all hosts have failed 30583 1726853765.49735: getting the remaining hosts for this loop 30583 1726853765.49737: done getting the remaining hosts for this loop 30583 1726853765.49740: getting the next task for host managed_node2 30583 1726853765.49747: done getting next task for host managed_node2 30583 1726853765.49750: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853765.49755: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853765.49769: getting variables 30583 1726853765.49770: in VariableManager get_vars() 30583 1726853765.49806: Calling all_inventory to load vars for managed_node2 30583 1726853765.49809: Calling groups_inventory to load vars for managed_node2 30583 1726853765.49811: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853765.49820: Calling all_plugins_play to load vars for managed_node2 30583 1726853765.49823: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853765.49825: Calling groups_plugins_play to load vars for managed_node2 30583 1726853765.50868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853765.51830: done with get_vars() 30583 1726853765.51849: done getting variables 30583 1726853765.51898: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:36:05 -0400 (0:00:00.043) 0:01:40.856 ****** 30583 1726853765.51929: entering _queue_task() for managed_node2/fail 30583 1726853765.52226: worker is 1 (out of 1 available) 30583 1726853765.52240: exiting _queue_task() for managed_node2/fail 30583 1726853765.52252: done queuing things up, now waiting for results queue to drain 30583 1726853765.52253: waiting for pending results... 30583 1726853765.52630: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853765.52874: in run() - task 02083763-bbaf-05ea-abc5-000000001d2d 30583 1726853765.52925: variable 'ansible_search_path' from source: unknown 30583 1726853765.52934: variable 'ansible_search_path' from source: unknown 30583 1726853765.52984: calling self._execute() 30583 1726853765.53090: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853765.53100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853765.53111: variable 'omit' from source: magic vars 30583 1726853765.53630: variable 'ansible_distribution_major_version' from source: facts 30583 1726853765.53674: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853765.53912: variable 'network_state' from source: role '' defaults 30583 1726853765.53928: Evaluated conditional (network_state != {}): False 30583 1726853765.53936: when evaluation is False, skipping this task 30583 1726853765.53975: _execute() done 30583 1726853765.54086: dumping result to json 30583 1726853765.54091: done dumping result, returning 30583 1726853765.54094: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-05ea-abc5-000000001d2d] 30583 1726853765.54097: sending task result for task 02083763-bbaf-05ea-abc5-000000001d2d 30583 1726853765.54184: done sending task result for task 02083763-bbaf-05ea-abc5-000000001d2d 30583 1726853765.54197: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853765.54292: no more pending results, returning what we have 30583 1726853765.54297: results queue empty 30583 1726853765.54298: checking for any_errors_fatal 30583 1726853765.54306: done checking for any_errors_fatal 30583 1726853765.54307: checking for max_fail_percentage 30583 1726853765.54310: done checking for max_fail_percentage 30583 1726853765.54311: checking to see if all hosts have failed and the running result is not ok 30583 1726853765.54312: done checking to see if all hosts have failed 30583 1726853765.54312: getting the remaining hosts for this loop 30583 1726853765.54314: done getting the remaining hosts for this loop 30583 1726853765.54318: getting the next task for host managed_node2 30583 1726853765.54328: done getting next task for host managed_node2 30583 1726853765.54332: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30583 1726853765.54339: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853765.54376: getting variables 30583 1726853765.54378: in VariableManager get_vars() 30583 1726853765.54427: Calling all_inventory to load vars for managed_node2 30583 1726853765.54430: Calling groups_inventory to load vars for managed_node2 30583 1726853765.54433: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853765.54683: Calling all_plugins_play to load vars for managed_node2 30583 1726853765.54688: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853765.54692: Calling groups_plugins_play to load vars for managed_node2 30583 1726853765.56254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853765.58628: done with get_vars() 30583 1726853765.58661: done getting variables 30583 1726853765.58721: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:36:05 -0400 (0:00:00.068) 0:01:40.924 ****** 30583 1726853765.58757: entering _queue_task() for managed_node2/fail 30583 1726853765.59245: worker is 1 (out of 1 available) 30583 1726853765.59260: exiting _queue_task() for managed_node2/fail 30583 1726853765.59478: done queuing things up, now waiting for results queue to drain 30583 1726853765.59481: waiting for pending results... 30583 1726853765.59689: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30583 1726853765.59734: in run() - task 02083763-bbaf-05ea-abc5-000000001d2e 30583 1726853765.59750: variable 'ansible_search_path' from source: unknown 30583 1726853765.59757: variable 'ansible_search_path' from source: unknown 30583 1726853765.59814: calling self._execute() 30583 1726853765.59902: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853765.60029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853765.60033: variable 'omit' from source: magic vars 30583 1726853765.60362: variable 'ansible_distribution_major_version' from source: facts 30583 1726853765.60382: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853765.60516: variable 'network_state' from source: role '' defaults 30583 1726853765.60533: Evaluated conditional (network_state != {}): False 30583 1726853765.60542: when evaluation is False, skipping this task 30583 1726853765.60550: _execute() done 30583 1726853765.60557: dumping result to json 30583 1726853765.60573: done dumping result, returning 30583 1726853765.60586: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [02083763-bbaf-05ea-abc5-000000001d2e] 30583 1726853765.60596: sending task result for task 02083763-bbaf-05ea-abc5-000000001d2e skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853765.60866: no more pending results, returning what we have 30583 1726853765.60870: results queue empty 30583 1726853765.60873: checking for any_errors_fatal 30583 1726853765.60883: done checking for any_errors_fatal 30583 1726853765.60884: checking for max_fail_percentage 30583 1726853765.60886: done checking for max_fail_percentage 30583 1726853765.60887: checking to see if all hosts have failed and the running result is not ok 30583 1726853765.60888: done checking to see if all hosts have failed 30583 1726853765.60889: getting the remaining hosts for this loop 30583 1726853765.60891: done getting the remaining hosts for this loop 30583 1726853765.60895: getting the next task for host managed_node2 30583 1726853765.60904: done getting next task for host managed_node2 30583 1726853765.60909: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30583 1726853765.60916: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853765.60948: getting variables 30583 1726853765.60951: in VariableManager get_vars() 30583 1726853765.61200: Calling all_inventory to load vars for managed_node2 30583 1726853765.61203: Calling groups_inventory to load vars for managed_node2 30583 1726853765.61206: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853765.61215: Calling all_plugins_play to load vars for managed_node2 30583 1726853765.61218: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853765.61222: Calling groups_plugins_play to load vars for managed_node2 30583 1726853765.61974: done sending task result for task 02083763-bbaf-05ea-abc5-000000001d2e 30583 1726853765.61977: WORKER PROCESS EXITING 30583 1726853765.64775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853765.67451: done with get_vars() 30583 1726853765.67487: done getting variables 30583 1726853765.67547: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 
or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:36:05 -0400 (0:00:00.088) 0:01:41.013 ****** 30583 1726853765.67590: entering _queue_task() for managed_node2/fail 30583 1726853765.68408: worker is 1 (out of 1 available) 30583 1726853765.68419: exiting _queue_task() for managed_node2/fail 30583 1726853765.68430: done queuing things up, now waiting for results queue to drain 30583 1726853765.68431: waiting for pending results... 30583 1726853765.69161: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30583 1726853765.69770: in run() - task 02083763-bbaf-05ea-abc5-000000001d2f 30583 1726853765.69947: variable 'ansible_search_path' from source: unknown 30583 1726853765.69952: variable 'ansible_search_path' from source: unknown 30583 1726853765.69994: calling self._execute() 30583 1726853765.70319: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853765.70325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853765.70335: variable 'omit' from source: magic vars 30583 1726853765.71826: variable 'ansible_distribution_major_version' from source: facts 30583 1726853765.71838: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853765.72218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853765.77536: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853765.77689: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853765.77928: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 
1726853765.77932: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853765.77934: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853765.78092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853765.78131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853765.78363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853765.78367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853765.78370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853765.78517: variable 'ansible_distribution_major_version' from source: facts 30583 1726853765.78534: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30583 1726853765.78793: variable 'ansible_distribution' from source: facts 30583 1726853765.78796: variable '__network_rh_distros' from source: role '' defaults 30583 1726853765.78799: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30583 1726853765.79352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853765.79444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853765.79473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853765.79605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853765.79620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853765.79843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853765.79865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853765.80076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853765.80100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853765.80115: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853765.80160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853765.80182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853765.80205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853765.80244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853765.80260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853765.81211: variable 'network_connections' from source: include params 30583 1726853765.81223: variable 'interface' from source: play vars 30583 1726853765.81329: variable 'interface' from source: play vars 30583 1726853765.81336: variable 'network_state' from source: role '' defaults 30583 1726853765.81528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853765.81976: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853765.82093: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 
1726853765.82102: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853765.82377: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853765.82381: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853765.82383: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853765.82394: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853765.82397: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853765.82528: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30583 1726853765.82532: when evaluation is False, skipping this task 30583 1726853765.82534: _execute() done 30583 1726853765.82537: dumping result to json 30583 1726853765.82539: done dumping result, returning 30583 1726853765.82542: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-05ea-abc5-000000001d2f] 30583 1726853765.82545: sending task result for task 02083763-bbaf-05ea-abc5-000000001d2f skipping: [managed_node2] => { "changed": 
false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30583 1726853765.82829: no more pending results, returning what we have 30583 1726853765.82833: results queue empty 30583 1726853765.82834: checking for any_errors_fatal 30583 1726853765.82844: done checking for any_errors_fatal 30583 1726853765.82844: checking for max_fail_percentage 30583 1726853765.82847: done checking for max_fail_percentage 30583 1726853765.82848: checking to see if all hosts have failed and the running result is not ok 30583 1726853765.82849: done checking to see if all hosts have failed 30583 1726853765.82850: getting the remaining hosts for this loop 30583 1726853765.82852: done getting the remaining hosts for this loop 30583 1726853765.82856: getting the next task for host managed_node2 30583 1726853765.82867: done getting next task for host managed_node2 30583 1726853765.82873: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30583 1726853765.82879: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853765.82909: getting variables 30583 1726853765.82911: in VariableManager get_vars() 30583 1726853765.82961: Calling all_inventory to load vars for managed_node2 30583 1726853765.82964: Calling groups_inventory to load vars for managed_node2 30583 1726853765.82966: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853765.83289: Calling all_plugins_play to load vars for managed_node2 30583 1726853765.83293: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853765.83297: Calling groups_plugins_play to load vars for managed_node2 30583 1726853765.83885: done sending task result for task 02083763-bbaf-05ea-abc5-000000001d2f 30583 1726853765.83889: WORKER PROCESS EXITING 30583 1726853765.86464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853765.90693: done with get_vars() 30583 1726853765.90727: done getting variables 30583 1726853765.90792: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due 
to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:36:05 -0400 (0:00:00.232) 0:01:41.245 ****** 30583 1726853765.90835: entering _queue_task() for managed_node2/dnf 30583 1726853765.91238: worker is 1 (out of 1 available) 30583 1726853765.91367: exiting _queue_task() for managed_node2/dnf 30583 1726853765.91382: done queuing things up, now waiting for results queue to drain 30583 1726853765.91383: waiting for pending results... 30583 1726853765.91606: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30583 1726853765.91751: in run() - task 02083763-bbaf-05ea-abc5-000000001d30 30583 1726853765.91775: variable 'ansible_search_path' from source: unknown 30583 1726853765.91793: variable 'ansible_search_path' from source: unknown 30583 1726853765.91843: calling self._execute() 30583 1726853765.91965: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853765.91979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853765.91994: variable 'omit' from source: magic vars 30583 1726853765.92907: variable 'ansible_distribution_major_version' from source: facts 30583 1726853765.92911: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853765.93301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853765.98965: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853765.99168: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853765.99384: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853765.99387: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853765.99390: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853765.99622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853765.99689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853765.99787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853765.99842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853765.99875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853766.00035: variable 'ansible_distribution' from source: facts 30583 1726853766.00045: variable 'ansible_distribution_major_version' from source: facts 30583 1726853766.00087: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30583 1726853766.00226: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853766.00392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853766.00432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853766.00468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853766.00533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853766.00563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853766.00619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853766.00650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853766.00687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853766.00736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853766.00756: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853766.00837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853766.00849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853766.00884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853766.00946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853766.00954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853766.01166: variable 'network_connections' from source: include params 30583 1726853766.01170: variable 'interface' from source: play vars 30583 1726853766.01234: variable 'interface' from source: play vars 30583 1726853766.01356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853766.01604: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853766.01634: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853766.01725: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853766.01729: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853766.01786: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853766.01819: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853766.01867: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853766.01931: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853766.01985: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853766.02345: variable 'network_connections' from source: include params 30583 1726853766.02366: variable 'interface' from source: play vars 30583 1726853766.02479: variable 'interface' from source: play vars 30583 1726853766.02483: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853766.02486: when evaluation is False, skipping this task 30583 1726853766.02488: _execute() done 30583 1726853766.02490: dumping result to json 30583 1726853766.02498: done dumping result, returning 30583 1726853766.02537: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000001d30] 30583 
1726853766.02540: sending task result for task 02083763-bbaf-05ea-abc5-000000001d30 30583 1726853766.02832: done sending task result for task 02083763-bbaf-05ea-abc5-000000001d30 30583 1726853766.02836: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853766.02916: no more pending results, returning what we have 30583 1726853766.02920: results queue empty 30583 1726853766.02921: checking for any_errors_fatal 30583 1726853766.02929: done checking for any_errors_fatal 30583 1726853766.02930: checking for max_fail_percentage 30583 1726853766.02932: done checking for max_fail_percentage 30583 1726853766.02933: checking to see if all hosts have failed and the running result is not ok 30583 1726853766.02934: done checking to see if all hosts have failed 30583 1726853766.02934: getting the remaining hosts for this loop 30583 1726853766.02936: done getting the remaining hosts for this loop 30583 1726853766.02940: getting the next task for host managed_node2 30583 1726853766.02955: done getting next task for host managed_node2 30583 1726853766.02963: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30583 1726853766.02974: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853766.03002: getting variables 30583 1726853766.03006: in VariableManager get_vars() 30583 1726853766.03276: Calling all_inventory to load vars for managed_node2 30583 1726853766.03279: Calling groups_inventory to load vars for managed_node2 30583 1726853766.03282: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853766.03378: Calling all_plugins_play to load vars for managed_node2 30583 1726853766.03382: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853766.03387: Calling groups_plugins_play to load vars for managed_node2 30583 1726853766.05393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853766.07651: done with get_vars() 30583 1726853766.07679: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30583 1726853766.07733: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:36:06 -0400 (0:00:00.169) 0:01:41.414 ****** 30583 1726853766.07761: entering _queue_task() for managed_node2/yum 30583 1726853766.08031: worker is 1 (out of 1 available) 30583 1726853766.08044: exiting _queue_task() for managed_node2/yum 30583 1726853766.08060: done queuing things up, now waiting for results queue to drain 30583 1726853766.08062: waiting for pending results... 30583 1726853766.08262: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30583 1726853766.08361: in run() - task 02083763-bbaf-05ea-abc5-000000001d31 30583 1726853766.08376: variable 'ansible_search_path' from source: unknown 30583 1726853766.08379: variable 'ansible_search_path' from source: unknown 30583 1726853766.08412: calling self._execute() 30583 1726853766.08491: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853766.08495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853766.08507: variable 'omit' from source: magic vars 30583 1726853766.08805: variable 'ansible_distribution_major_version' from source: facts 30583 1726853766.08813: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853766.08937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853766.12002: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853766.12051: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853766.12083: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853766.12110: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853766.12131: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853766.12196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853766.12538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853766.12558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853766.12590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853766.12601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853766.12678: variable 'ansible_distribution_major_version' from source: facts 30583 1726853766.12693: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30583 1726853766.12696: when evaluation is False, skipping this task 30583 1726853766.12700: _execute() done 30583 1726853766.12703: dumping result to json 30583 1726853766.12706: done dumping result, returning 30583 1726853766.12716: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000001d31] 30583 1726853766.12719: sending task result for task 02083763-bbaf-05ea-abc5-000000001d31 30583 1726853766.12811: done sending task result for task 02083763-bbaf-05ea-abc5-000000001d31 30583 1726853766.12814: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30583 1726853766.12872: no more pending results, returning what we have 30583 1726853766.12876: results queue empty 30583 1726853766.12877: checking for any_errors_fatal 30583 1726853766.12885: done checking for any_errors_fatal 30583 1726853766.12885: checking for max_fail_percentage 30583 1726853766.12887: done checking for max_fail_percentage 30583 1726853766.12888: checking to see if all hosts have failed and the running result is not ok 30583 1726853766.12889: done checking to see if all hosts have failed 30583 1726853766.12890: getting the remaining hosts for this loop 30583 1726853766.12891: done getting the remaining hosts for this loop 30583 1726853766.12895: getting the next task for host managed_node2 30583 1726853766.12904: done getting next task for host managed_node2 30583 1726853766.12908: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30583 1726853766.12912: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853766.12940: getting variables 30583 1726853766.12942: in VariableManager get_vars() 30583 1726853766.12986: Calling all_inventory to load vars for managed_node2 30583 1726853766.12989: Calling groups_inventory to load vars for managed_node2 30583 1726853766.12991: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853766.12999: Calling all_plugins_play to load vars for managed_node2 30583 1726853766.13002: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853766.13004: Calling groups_plugins_play to load vars for managed_node2 30583 1726853766.14361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853766.16249: done with get_vars() 30583 1726853766.16286: done getting variables 30583 1726853766.16362: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:36:06 -0400 (0:00:00.086) 0:01:41.501 ****** 30583 1726853766.16409: entering _queue_task() for managed_node2/fail 30583 1726853766.16680: worker is 1 (out of 1 available) 30583 1726853766.16693: exiting _queue_task() for managed_node2/fail 30583 1726853766.16706: done queuing things up, now waiting for results queue to drain 30583 1726853766.16708: waiting for pending results... 30583 1726853766.16909: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30583 1726853766.17054: in run() - task 02083763-bbaf-05ea-abc5-000000001d32 30583 1726853766.17058: variable 'ansible_search_path' from source: unknown 30583 1726853766.17061: variable 'ansible_search_path' from source: unknown 30583 1726853766.17130: calling self._execute() 30583 1726853766.17228: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853766.17232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853766.17266: variable 'omit' from source: magic vars 30583 1726853766.18048: variable 'ansible_distribution_major_version' from source: facts 30583 1726853766.18054: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853766.18425: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853766.18682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853766.20732: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853766.20802: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853766.20843: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853766.20880: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853766.20899: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853766.20992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853766.21021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853766.21043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853766.21077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853766.21106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853766.21129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853766.21157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853766.21186: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853766.21215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853766.21226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853766.21273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853766.21304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853766.21322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853766.21347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853766.21373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853766.21516: variable 'network_connections' from source: include params 30583 1726853766.21527: variable 'interface' from source: play vars 30583 1726853766.21608: variable 'interface' from source: play vars 30583 1726853766.21667: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853766.21798: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853766.21825: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853766.21860: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853766.21884: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853766.21911: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853766.21926: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853766.21945: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853766.21963: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853766.22013: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853766.22208: variable 'network_connections' from source: include params 30583 1726853766.22213: variable 'interface' from source: play vars 30583 1726853766.22260: variable 'interface' from source: play vars 30583 1726853766.22280: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853766.22283: when evaluation is False, skipping this task 30583 
1726853766.22286: _execute() done 30583 1726853766.22288: dumping result to json 30583 1726853766.22290: done dumping result, returning 30583 1726853766.22297: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000001d32] 30583 1726853766.22301: sending task result for task 02083763-bbaf-05ea-abc5-000000001d32 30583 1726853766.22451: done sending task result for task 02083763-bbaf-05ea-abc5-000000001d32 30583 1726853766.22454: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853766.22513: no more pending results, returning what we have 30583 1726853766.22516: results queue empty 30583 1726853766.22517: checking for any_errors_fatal 30583 1726853766.22524: done checking for any_errors_fatal 30583 1726853766.22524: checking for max_fail_percentage 30583 1726853766.22526: done checking for max_fail_percentage 30583 1726853766.22527: checking to see if all hosts have failed and the running result is not ok 30583 1726853766.22528: done checking to see if all hosts have failed 30583 1726853766.22528: getting the remaining hosts for this loop 30583 1726853766.22530: done getting the remaining hosts for this loop 30583 1726853766.22534: getting the next task for host managed_node2 30583 1726853766.22542: done getting next task for host managed_node2 30583 1726853766.22545: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30583 1726853766.22550: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853766.22580: getting variables 30583 1726853766.22582: in VariableManager get_vars() 30583 1726853766.22626: Calling all_inventory to load vars for managed_node2 30583 1726853766.22628: Calling groups_inventory to load vars for managed_node2 30583 1726853766.22630: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853766.22639: Calling all_plugins_play to load vars for managed_node2 30583 1726853766.22641: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853766.22644: Calling groups_plugins_play to load vars for managed_node2 30583 1726853766.23626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853766.24617: done with get_vars() 30583 1726853766.24637: done getting variables 30583 1726853766.24681: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:36:06 -0400 (0:00:00.083) 0:01:41.584 ****** 30583 1726853766.24711: entering _queue_task() for managed_node2/package 30583 1726853766.24970: worker is 1 (out of 1 available) 30583 1726853766.24987: exiting _queue_task() for managed_node2/package 30583 1726853766.24998: done queuing things up, now waiting for results queue to drain 30583 1726853766.24999: waiting for pending results... 30583 1726853766.25200: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 30583 1726853766.25301: in run() - task 02083763-bbaf-05ea-abc5-000000001d33 30583 1726853766.25311: variable 'ansible_search_path' from source: unknown 30583 1726853766.25314: variable 'ansible_search_path' from source: unknown 30583 1726853766.25345: calling self._execute() 30583 1726853766.25426: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853766.25430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853766.25438: variable 'omit' from source: magic vars 30583 1726853766.25740: variable 'ansible_distribution_major_version' from source: facts 30583 1726853766.25749: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853766.25895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853766.26086: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853766.26122: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853766.26148: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853766.26226: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853766.26299: variable 'network_packages' from source: role '' defaults 30583 1726853766.26389: variable '__network_provider_setup' from source: role '' defaults 30583 1726853766.26392: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853766.26445: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853766.26448: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853766.26494: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853766.26609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853766.28353: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853766.28357: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853766.28365: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853766.28520: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853766.28523: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853766.28526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853766.28682: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853766.28686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853766.28689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853766.28693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853766.28718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853766.28753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853766.28786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853766.28811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853766.28834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 
1726853766.29027: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853766.29211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853766.29217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853766.29220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853766.29247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853766.29253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853766.29323: variable 'ansible_python' from source: facts 30583 1726853766.29425: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853766.29428: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853766.29525: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853766.29596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853766.29613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853766.29630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853766.29681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853766.29740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853766.29750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853766.29772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853766.29877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853766.29881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853766.29883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853766.29976: variable 'network_connections' from source: include params 
30583 1726853766.29982: variable 'interface' from source: play vars 30583 1726853766.30063: variable 'interface' from source: play vars 30583 1726853766.30197: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853766.30200: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853766.30206: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853766.30228: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853766.30269: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853766.30515: variable 'network_connections' from source: include params 30583 1726853766.30525: variable 'interface' from source: play vars 30583 1726853766.30670: variable 'interface' from source: play vars 30583 1726853766.30675: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853766.30790: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853766.31075: variable 'network_connections' from source: include params 30583 1726853766.31078: variable 'interface' from source: play vars 30583 1726853766.31107: variable 'interface' from source: play vars 30583 1726853766.31123: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853766.31269: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853766.31440: variable 'network_connections' 
from source: include params 30583 1726853766.31451: variable 'interface' from source: play vars 30583 1726853766.31493: variable 'interface' from source: play vars 30583 1726853766.31530: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853766.31583: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853766.31589: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853766.31637: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853766.31935: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853766.32255: variable 'network_connections' from source: include params 30583 1726853766.32261: variable 'interface' from source: play vars 30583 1726853766.32303: variable 'interface' from source: play vars 30583 1726853766.32330: variable 'ansible_distribution' from source: facts 30583 1726853766.32333: variable '__network_rh_distros' from source: role '' defaults 30583 1726853766.32338: variable 'ansible_distribution_major_version' from source: facts 30583 1726853766.32340: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853766.32517: variable 'ansible_distribution' from source: facts 30583 1726853766.32520: variable '__network_rh_distros' from source: role '' defaults 30583 1726853766.32522: variable 'ansible_distribution_major_version' from source: facts 30583 1726853766.32524: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853766.32634: variable 'ansible_distribution' from source: facts 30583 1726853766.32638: variable '__network_rh_distros' from source: role '' defaults 30583 1726853766.32640: variable 'ansible_distribution_major_version' from source: facts 30583 1726853766.32675: variable 'network_provider' from source: set_fact 30583 
1726853766.32689: variable 'ansible_facts' from source: unknown 30583 1726853766.33170: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30583 1726853766.33175: when evaluation is False, skipping this task 30583 1726853766.33177: _execute() done 30583 1726853766.33180: dumping result to json 30583 1726853766.33182: done dumping result, returning 30583 1726853766.33190: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-05ea-abc5-000000001d33] 30583 1726853766.33194: sending task result for task 02083763-bbaf-05ea-abc5-000000001d33 30583 1726853766.33298: done sending task result for task 02083763-bbaf-05ea-abc5-000000001d33 30583 1726853766.33301: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30583 1726853766.33346: no more pending results, returning what we have 30583 1726853766.33349: results queue empty 30583 1726853766.33350: checking for any_errors_fatal 30583 1726853766.33360: done checking for any_errors_fatal 30583 1726853766.33360: checking for max_fail_percentage 30583 1726853766.33363: done checking for max_fail_percentage 30583 1726853766.33363: checking to see if all hosts have failed and the running result is not ok 30583 1726853766.33364: done checking to see if all hosts have failed 30583 1726853766.33365: getting the remaining hosts for this loop 30583 1726853766.33367: done getting the remaining hosts for this loop 30583 1726853766.33372: getting the next task for host managed_node2 30583 1726853766.33380: done getting next task for host managed_node2 30583 1726853766.33383: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30583 1726853766.33388: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853766.33414: getting variables 30583 1726853766.33415: in VariableManager get_vars() 30583 1726853766.33466: Calling all_inventory to load vars for managed_node2 30583 1726853766.33468: Calling groups_inventory to load vars for managed_node2 30583 1726853766.33499: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853766.33510: Calling all_plugins_play to load vars for managed_node2 30583 1726853766.33512: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853766.33515: Calling groups_plugins_play to load vars for managed_node2 30583 1726853766.34529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853766.35577: done with get_vars() 30583 1726853766.35601: done getting variables 30583 1726853766.35667: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:36:06 -0400 (0:00:00.109) 0:01:41.694 ****** 30583 1726853766.35696: entering _queue_task() for managed_node2/package 30583 1726853766.35973: worker is 1 (out of 1 available) 30583 1726853766.35988: exiting _queue_task() for managed_node2/package 30583 1726853766.36001: done queuing things up, now waiting for results queue to drain 30583 1726853766.36003: waiting for pending results... 
30583 1726853766.36216: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30583 1726853766.36318: in run() - task 02083763-bbaf-05ea-abc5-000000001d34 30583 1726853766.36329: variable 'ansible_search_path' from source: unknown 30583 1726853766.36340: variable 'ansible_search_path' from source: unknown 30583 1726853766.36369: calling self._execute() 30583 1726853766.36448: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853766.36453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853766.36465: variable 'omit' from source: magic vars 30583 1726853766.36759: variable 'ansible_distribution_major_version' from source: facts 30583 1726853766.36775: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853766.36855: variable 'network_state' from source: role '' defaults 30583 1726853766.36865: Evaluated conditional (network_state != {}): False 30583 1726853766.36870: when evaluation is False, skipping this task 30583 1726853766.36889: _execute() done 30583 1726853766.36906: dumping result to json 30583 1726853766.36909: done dumping result, returning 30583 1726853766.36913: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-05ea-abc5-000000001d34] 30583 1726853766.36915: sending task result for task 02083763-bbaf-05ea-abc5-000000001d34 30583 1726853766.37008: done sending task result for task 02083763-bbaf-05ea-abc5-000000001d34 30583 1726853766.37011: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853766.37056: no more pending results, returning what we have 30583 1726853766.37062: results queue empty 30583 1726853766.37063: checking 
for any_errors_fatal 30583 1726853766.37070: done checking for any_errors_fatal 30583 1726853766.37072: checking for max_fail_percentage 30583 1726853766.37075: done checking for max_fail_percentage 30583 1726853766.37076: checking to see if all hosts have failed and the running result is not ok 30583 1726853766.37076: done checking to see if all hosts have failed 30583 1726853766.37077: getting the remaining hosts for this loop 30583 1726853766.37079: done getting the remaining hosts for this loop 30583 1726853766.37082: getting the next task for host managed_node2 30583 1726853766.37091: done getting next task for host managed_node2 30583 1726853766.37094: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30583 1726853766.37099: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853766.37128: getting variables 30583 1726853766.37129: in VariableManager get_vars() 30583 1726853766.37177: Calling all_inventory to load vars for managed_node2 30583 1726853766.37180: Calling groups_inventory to load vars for managed_node2 30583 1726853766.37183: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853766.37192: Calling all_plugins_play to load vars for managed_node2 30583 1726853766.37195: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853766.37197: Calling groups_plugins_play to load vars for managed_node2 30583 1726853766.38400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853766.40180: done with get_vars() 30583 1726853766.40197: done getting variables 30583 1726853766.40251: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:36:06 -0400 (0:00:00.045) 0:01:41.740 ****** 30583 1726853766.40282: entering _queue_task() for managed_node2/package 30583 1726853766.40576: worker is 1 (out of 1 available) 30583 1726853766.40588: exiting _queue_task() for managed_node2/package 30583 1726853766.40601: done queuing things up, now waiting for results queue to drain 30583 1726853766.40602: waiting for pending results... 
30583 1726853766.40845: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30583 1726853766.40975: in run() - task 02083763-bbaf-05ea-abc5-000000001d35 30583 1726853766.41009: variable 'ansible_search_path' from source: unknown 30583 1726853766.41012: variable 'ansible_search_path' from source: unknown 30583 1726853766.41049: calling self._execute() 30583 1726853766.41145: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853766.41148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853766.41151: variable 'omit' from source: magic vars 30583 1726853766.41545: variable 'ansible_distribution_major_version' from source: facts 30583 1726853766.41612: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853766.41670: variable 'network_state' from source: role '' defaults 30583 1726853766.41697: Evaluated conditional (network_state != {}): False 30583 1726853766.41701: when evaluation is False, skipping this task 30583 1726853766.41704: _execute() done 30583 1726853766.41706: dumping result to json 30583 1726853766.41709: done dumping result, returning 30583 1726853766.41712: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-05ea-abc5-000000001d35] 30583 1726853766.41714: sending task result for task 02083763-bbaf-05ea-abc5-000000001d35 30583 1726853766.41812: done sending task result for task 02083763-bbaf-05ea-abc5-000000001d35 30583 1726853766.41815: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853766.41864: no more pending results, returning what we have 30583 1726853766.41868: results queue empty 30583 1726853766.41869: checking for 
any_errors_fatal 30583 1726853766.41883: done checking for any_errors_fatal 30583 1726853766.41884: checking for max_fail_percentage 30583 1726853766.41887: done checking for max_fail_percentage 30583 1726853766.41888: checking to see if all hosts have failed and the running result is not ok 30583 1726853766.41888: done checking to see if all hosts have failed 30583 1726853766.41889: getting the remaining hosts for this loop 30583 1726853766.41891: done getting the remaining hosts for this loop 30583 1726853766.41894: getting the next task for host managed_node2 30583 1726853766.41902: done getting next task for host managed_node2 30583 1726853766.41905: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30583 1726853766.41910: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853766.41940: getting variables 30583 1726853766.41945: in VariableManager get_vars() 30583 1726853766.41986: Calling all_inventory to load vars for managed_node2 30583 1726853766.41989: Calling groups_inventory to load vars for managed_node2 30583 1726853766.41991: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853766.42002: Calling all_plugins_play to load vars for managed_node2 30583 1726853766.42005: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853766.42011: Calling groups_plugins_play to load vars for managed_node2 30583 1726853766.43434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853766.44663: done with get_vars() 30583 1726853766.44686: done getting variables 30583 1726853766.44728: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:36:06 -0400 (0:00:00.044) 0:01:41.784 ****** 30583 1726853766.44755: entering _queue_task() for managed_node2/service 30583 1726853766.45024: worker is 1 (out of 1 available) 30583 1726853766.45039: exiting _queue_task() for managed_node2/service 30583 1726853766.45051: done queuing things up, now waiting for results queue to drain 30583 1726853766.45052: waiting for pending results... 
30583 1726853766.45245: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30583 1726853766.45350: in run() - task 02083763-bbaf-05ea-abc5-000000001d36 30583 1726853766.45364: variable 'ansible_search_path' from source: unknown 30583 1726853766.45367: variable 'ansible_search_path' from source: unknown 30583 1726853766.45397: calling self._execute() 30583 1726853766.45473: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853766.45477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853766.45486: variable 'omit' from source: magic vars 30583 1726853766.45784: variable 'ansible_distribution_major_version' from source: facts 30583 1726853766.45793: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853766.45880: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853766.46015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853766.48163: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853766.48167: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853766.48170: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853766.48189: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853766.48221: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853766.48331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30583 1726853766.48339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853766.48365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853766.48406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853766.48424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853766.48560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853766.48564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853766.48566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853766.48568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853766.48575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853766.48678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853766.48686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853766.48689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853766.48691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853766.48694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853766.48912: variable 'network_connections' from source: include params 30583 1726853766.48915: variable 'interface' from source: play vars 30583 1726853766.48949: variable 'interface' from source: play vars 30583 1726853766.49126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853766.49197: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853766.49238: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853766.49267: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853766.49391: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853766.49394: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853766.49397: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853766.49399: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853766.49421: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853766.49469: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853766.49690: variable 'network_connections' from source: include params 30583 1726853766.49696: variable 'interface' from source: play vars 30583 1726853766.49760: variable 'interface' from source: play vars 30583 1726853766.49808: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853766.49880: when evaluation is False, skipping this task 30583 1726853766.49882: _execute() done 30583 1726853766.49884: dumping result to json 30583 1726853766.49886: done dumping result, returning 30583 1726853766.49888: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000001d36] 30583 1726853766.49889: sending task result for task 02083763-bbaf-05ea-abc5-000000001d36 30583 1726853766.50011: done sending task result for task 
02083763-bbaf-05ea-abc5-000000001d36 30583 1726853766.50029: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853766.50124: no more pending results, returning what we have 30583 1726853766.50128: results queue empty 30583 1726853766.50129: checking for any_errors_fatal 30583 1726853766.50137: done checking for any_errors_fatal 30583 1726853766.50138: checking for max_fail_percentage 30583 1726853766.50141: done checking for max_fail_percentage 30583 1726853766.50142: checking to see if all hosts have failed and the running result is not ok 30583 1726853766.50143: done checking to see if all hosts have failed 30583 1726853766.50143: getting the remaining hosts for this loop 30583 1726853766.50148: done getting the remaining hosts for this loop 30583 1726853766.50152: getting the next task for host managed_node2 30583 1726853766.50206: done getting next task for host managed_node2 30583 1726853766.50212: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30583 1726853766.50217: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853766.50244: getting variables 30583 1726853766.50246: in VariableManager get_vars() 30583 1726853766.50328: Calling all_inventory to load vars for managed_node2 30583 1726853766.50331: Calling groups_inventory to load vars for managed_node2 30583 1726853766.50334: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853766.50344: Calling all_plugins_play to load vars for managed_node2 30583 1726853766.50348: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853766.50350: Calling groups_plugins_play to load vars for managed_node2 30583 1726853766.52262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853766.53238: done with get_vars() 30583 1726853766.53267: done getting variables 30583 1726853766.53313: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:36:06 -0400 (0:00:00.085) 0:01:41.870 ****** 30583 1726853766.53338: entering _queue_task() for managed_node2/service 30583 1726853766.53601: worker is 1 (out of 1 available) 30583 1726853766.53615: exiting _queue_task() for managed_node2/service 30583 1726853766.53628: done 
queuing things up, now waiting for results queue to drain 30583 1726853766.53630: waiting for pending results... 30583 1726853766.53885: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30583 1726853766.54005: in run() - task 02083763-bbaf-05ea-abc5-000000001d37 30583 1726853766.54010: variable 'ansible_search_path' from source: unknown 30583 1726853766.54013: variable 'ansible_search_path' from source: unknown 30583 1726853766.54052: calling self._execute() 30583 1726853766.54149: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853766.54156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853766.54164: variable 'omit' from source: magic vars 30583 1726853766.54484: variable 'ansible_distribution_major_version' from source: facts 30583 1726853766.54493: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853766.54607: variable 'network_provider' from source: set_fact 30583 1726853766.54610: variable 'network_state' from source: role '' defaults 30583 1726853766.54620: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30583 1726853766.54625: variable 'omit' from source: magic vars 30583 1726853766.54667: variable 'omit' from source: magic vars 30583 1726853766.54691: variable 'network_service_name' from source: role '' defaults 30583 1726853766.54737: variable 'network_service_name' from source: role '' defaults 30583 1726853766.54832: variable '__network_provider_setup' from source: role '' defaults 30583 1726853766.54835: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853766.54882: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853766.54890: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853766.54949: variable '__network_packages_default_nm' from source: role '' 
defaults 30583 1726853766.55381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853766.57377: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853766.57431: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853766.57474: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853766.57524: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853766.57545: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853766.57611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853766.57631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853766.57648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853766.57677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853766.57714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853766.57738: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853766.57754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853766.57775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853766.57800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853766.57810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853766.57982: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853766.58060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853766.58079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853766.58096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853766.58121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853766.58132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853766.58200: variable 'ansible_python' from source: facts 30583 1726853766.58213: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853766.58275: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853766.58328: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853766.58412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853766.58430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853766.58448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853766.58479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853766.58490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853766.58521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853766.58542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853766.58558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853766.58589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853766.58600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853766.58695: variable 'network_connections' from source: include params 30583 1726853766.58702: variable 'interface' from source: play vars 30583 1726853766.58753: variable 'interface' from source: play vars 30583 1726853766.58832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853766.58956: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853766.58996: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853766.59030: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853766.59059: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853766.59137: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853766.59158: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853766.59199: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853766.59381: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853766.59384: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853766.59628: variable 'network_connections' from source: include params 30583 1726853766.59640: variable 'interface' from source: play vars 30583 1726853766.59730: variable 'interface' from source: play vars 30583 1726853766.59769: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853766.59875: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853766.60228: variable 'network_connections' from source: include params 30583 1726853766.60241: variable 'interface' from source: play vars 30583 1726853766.60332: variable 'interface' from source: play vars 30583 1726853766.60378: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853766.60480: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853766.60846: variable 'network_connections' from source: include params 30583 1726853766.60857: variable 'interface' from source: play vars 30583 1726853766.60948: variable 'interface' from source: play vars 30583 1726853766.61025: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30583 1726853766.61094: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853766.61114: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853766.61245: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853766.61507: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853766.62109: variable 'network_connections' from source: include params 30583 1726853766.62136: variable 'interface' from source: play vars 30583 1726853766.62229: variable 'interface' from source: play vars 30583 1726853766.62238: variable 'ansible_distribution' from source: facts 30583 1726853766.62245: variable '__network_rh_distros' from source: role '' defaults 30583 1726853766.62247: variable 'ansible_distribution_major_version' from source: facts 30583 1726853766.62330: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853766.62490: variable 'ansible_distribution' from source: facts 30583 1726853766.62498: variable '__network_rh_distros' from source: role '' defaults 30583 1726853766.62563: variable 'ansible_distribution_major_version' from source: facts 30583 1726853766.62566: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853766.62713: variable 'ansible_distribution' from source: facts 30583 1726853766.62722: variable '__network_rh_distros' from source: role '' defaults 30583 1726853766.62729: variable 'ansible_distribution_major_version' from source: facts 30583 1726853766.62789: variable 'network_provider' from source: set_fact 30583 1726853766.62876: variable 'omit' from source: magic vars 30583 1726853766.62881: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853766.62884: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853766.62912: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853766.62931: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853766.62944: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853766.62982: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853766.62990: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853766.63009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853766.63126: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853766.63137: Set connection var ansible_timeout to 10 30583 1726853766.63143: Set connection var ansible_connection to ssh 30583 1726853766.63152: Set connection var ansible_shell_executable to /bin/sh 30583 1726853766.63219: Set connection var ansible_shell_type to sh 30583 1726853766.63222: Set connection var ansible_pipelining to False 30583 1726853766.63225: variable 'ansible_shell_executable' from source: unknown 30583 1726853766.63227: variable 'ansible_connection' from source: unknown 30583 1726853766.63228: variable 'ansible_module_compression' from source: unknown 30583 1726853766.63230: variable 'ansible_shell_type' from source: unknown 30583 1726853766.63232: variable 'ansible_shell_executable' from source: unknown 30583 1726853766.63246: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853766.63324: variable 'ansible_pipelining' from source: unknown 30583 1726853766.63327: variable 'ansible_timeout' from source: unknown 30583 1726853766.63330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 
1726853766.63392: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853766.63411: variable 'omit' from source: magic vars 30583 1726853766.63420: starting attempt loop 30583 1726853766.63426: running the handler 30583 1726853766.63520: variable 'ansible_facts' from source: unknown 30583 1726853766.64817: _low_level_execute_command(): starting 30583 1726853766.64968: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853766.65846: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853766.65870: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853766.65890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853766.65954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853766.66031: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853766.66057: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853766.66092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853766.66192: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853766.68203: stdout chunk (state=3): >>>/root <<< 30583 1726853766.68206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853766.68208: stdout chunk (state=3): >>><<< 30583 1726853766.68210: stderr chunk (state=3): >>><<< 30583 1726853766.68279: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853766.68290: _low_level_execute_command(): starting 30583 1726853766.68415: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir 
"` echo /root/.ansible/tmp/ansible-tmp-1726853766.6826782-35393-62318585764765 `" && echo ansible-tmp-1726853766.6826782-35393-62318585764765="` echo /root/.ansible/tmp/ansible-tmp-1726853766.6826782-35393-62318585764765 `" ) && sleep 0' 30583 1726853766.69373: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853766.69749: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853766.69867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853766.70083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853766.72018: stdout chunk (state=3): >>>ansible-tmp-1726853766.6826782-35393-62318585764765=/root/.ansible/tmp/ansible-tmp-1726853766.6826782-35393-62318585764765 <<< 30583 1726853766.72122: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853766.72181: stderr chunk (state=3): >>><<< 30583 1726853766.72255: stdout chunk (state=3): >>><<< 30583 1726853766.72279: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853766.6826782-35393-62318585764765=/root/.ansible/tmp/ansible-tmp-1726853766.6826782-35393-62318585764765 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853766.72313: variable 'ansible_module_compression' from source: unknown 30583 1726853766.72596: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30583 1726853766.72660: variable 'ansible_facts' from source: unknown 30583 1726853766.73084: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853766.6826782-35393-62318585764765/AnsiballZ_systemd.py 30583 1726853766.73503: Sending initial data 30583 1726853766.73507: Sent initial data (155 bytes) 30583 1726853766.74374: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 
30583 1726853766.74398: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853766.74401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853766.74437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853766.74533: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853766.74562: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853766.74665: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853766.76518: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853766.76604: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853766.76668: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmplw8rjvvy /root/.ansible/tmp/ansible-tmp-1726853766.6826782-35393-62318585764765/AnsiballZ_systemd.py <<< 30583 1726853766.76675: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853766.6826782-35393-62318585764765/AnsiballZ_systemd.py" <<< 30583 1726853766.76733: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmplw8rjvvy" to remote "/root/.ansible/tmp/ansible-tmp-1726853766.6826782-35393-62318585764765/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853766.6826782-35393-62318585764765/AnsiballZ_systemd.py" <<< 30583 1726853766.78856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853766.78946: stderr chunk (state=3): >>><<< 30583 1726853766.78955: stdout chunk (state=3): >>><<< 30583 1726853766.79020: done transferring module to remote 30583 1726853766.79039: _low_level_execute_command(): starting 30583 1726853766.79054: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853766.6826782-35393-62318585764765/ /root/.ansible/tmp/ansible-tmp-1726853766.6826782-35393-62318585764765/AnsiballZ_systemd.py && sleep 0' 30583 1726853766.79798: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853766.79838: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853766.79864: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853766.80001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853766.81951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853766.81978: stdout chunk (state=3): >>><<< 30583 1726853766.82020: stderr chunk (state=3): >>><<< 30583 1726853766.82127: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853766.82131: _low_level_execute_command(): starting 30583 1726853766.82133: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853766.6826782-35393-62318585764765/AnsiballZ_systemd.py && sleep 0' 30583 1726853766.83167: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853766.83184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853766.83211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853766.83391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853766.83447: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853766.83465: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853766.83522: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853766.83638: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853767.13681: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": 
"{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4657152", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3304710144", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1970070000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", 
"MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": 
"3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", 
"RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30583 1726853767.15769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853767.15811: stderr chunk (state=3): >>><<< 30583 1726853767.15815: stdout chunk (state=3): >>><<< 30583 1726853767.15836: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4657152", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3304710144", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1970070000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", 
"ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853767.16009: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853766.6826782-35393-62318585764765/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
30583 1726853767.16033: _low_level_execute_command(): starting
30583 1726853767.16036: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853766.6826782-35393-62318585764765/ > /dev/null 2>&1 && sleep 0'
30583 1726853767.16652: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config <<<
30583 1726853767.16660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197
debug2: match not found <<<
30583 1726853767.16666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass <<<
30583 1726853767.16669: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197
debug2: match found <<<
30583 1726853767.16673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853767.16762: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d'
debug2: fd 3 setting O_NONBLOCK <<<
30583 1726853767.16765: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30583 1726853767.16830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30583 1726853767.18775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30583 1726853767.18798: stderr chunk (state=3): >>><<<
30583 1726853767.18801: stdout chunk (state=3): >>><<<
30583 1726853767.18813: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.197 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange:
master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
30583 1726853767.18821: handler run complete
30583 1726853767.18895: attempt loop complete, returning result
30583 1726853767.18898: _execute() done
30583 1726853767.18901: dumping result to json
30583 1726853767.18933: done dumping result, returning
30583 1726853767.18943: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-05ea-abc5-000000001d37]
30583 1726853767.18965: sending task result for task 02083763-bbaf-05ea-abc5-000000001d37
30583 1726853767.19205: done sending task result for task 02083763-bbaf-05ea-abc5-000000001d37
30583 1726853767.19207: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
30583 1726853767.19275: no more pending results, returning what we have
30583 1726853767.19279: results queue empty
30583 1726853767.19280: checking for any_errors_fatal
30583 1726853767.19288: done checking for any_errors_fatal
30583 1726853767.19289: checking for max_fail_percentage
30583 1726853767.19291: done checking for max_fail_percentage
30583 1726853767.19292: checking to see if all hosts have failed and the running result is not ok
30583 1726853767.19292: done checking to see if all hosts have failed
30583 1726853767.19293: getting the remaining hosts for this loop
30583 1726853767.19295: done getting the remaining hosts for this loop
30583 1726853767.19298: getting the next task for host managed_node2
30583 1726853767.19305: done getting next task for host managed_node2
30583 1726853767.19309: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
30583 1726853767.19314: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853767.19328: getting variables
30583 1726853767.19330: in VariableManager get_vars()
30583 1726853767.19446: Calling all_inventory to load vars for managed_node2
30583 1726853767.19449: Calling groups_inventory to load vars for managed_node2
30583 1726853767.19452: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853767.19461: Calling all_plugins_play to load vars for managed_node2
30583 1726853767.19463: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853767.19466: Calling groups_plugins_play to load vars for managed_node2
30583 1726853767.20365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853767.21299: done with get_vars()
30583 1726853767.21321: done getting variables
30583 1726853767.21372: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] *****
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133
Friday 20 September 2024  13:36:07 -0400 (0:00:00.680)       0:01:42.551 ******
30583 1726853767.21403: entering _queue_task() for managed_node2/service
30583 1726853767.21691: worker is 1 (out of 1 available)
30583 1726853767.21709: exiting _queue_task() for managed_node2/service
30583 1726853767.21722: done queuing things up, now waiting for results queue to drain
30583 1726853767.21724: waiting for pending results...
30583 1726853767.21962: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
30583 1726853767.22129: in run() - task 02083763-bbaf-05ea-abc5-000000001d38
30583 1726853767.22133: variable 'ansible_search_path' from source: unknown
30583 1726853767.22136: variable 'ansible_search_path' from source: unknown
30583 1726853767.22210: calling self._execute()
30583 1726853767.22288: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853767.22292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853767.22295: variable 'omit' from source: magic vars
30583 1726853767.22611: variable 'ansible_distribution_major_version' from source: facts
30583 1726853767.22621: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853767.22776: variable 'network_provider' from source: set_fact
30583 1726853767.22781: Evaluated conditional (network_provider == "nm"): True
30583 1726853767.22784: variable '__network_wpa_supplicant_required' from source: role '' defaults
30583 1726853767.22903: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
30583 1726853767.23002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30583 1726853767.25292: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30583 1726853767.25340: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30583 1726853767.25370: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30583 1726853767.25398: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30583 1726853767.25418: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30583 1726853767.25488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853767.25508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853767.25525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853767.25552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853767.25566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853767.25614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853767.25630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853767.25646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853767.25676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853767.25687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853767.25716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853767.25732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853767.25747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853767.25779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853767.25787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853767.25887: variable 'network_connections' from source: include params
30583 1726853767.25898: variable 'interface' from source: play vars
30583 1726853767.25976: variable 'interface' from source: play vars
30583 1726853767.26018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30583 1726853767.26169: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30583 1726853767.26203: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30583 1726853767.26287: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30583 1726853767.26290: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30583 1726853767.26303: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30583 1726853767.26326: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30583 1726853767.26338: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853767.26403: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30583 1726853767.26418: variable '__network_wireless_connections_defined' from source: role '' defaults
30583 1726853767.26652: variable 'network_connections' from source: include params
30583 1726853767.26655: variable 'interface' from source: play vars
30583 1726853767.26724: variable 'interface' from source: play vars
30583 1726853767.26741: Evaluated conditional (__network_wpa_supplicant_required): False
30583 1726853767.26786: when evaluation is False, skipping this task
30583 1726853767.26788: _execute() done
30583 1726853767.26791: dumping result to json
30583 1726853767.26793: done dumping result, returning
30583 1726853767.26798: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-05ea-abc5-000000001d38]
30583
1726853767.26813: sending task result for task 02083763-bbaf-05ea-abc5-000000001d38 30583 1726853767.26895: done sending task result for task 02083763-bbaf-05ea-abc5-000000001d38 30583 1726853767.26901: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30583 1726853767.26946: no more pending results, returning what we have 30583 1726853767.26950: results queue empty 30583 1726853767.26951: checking for any_errors_fatal 30583 1726853767.26974: done checking for any_errors_fatal 30583 1726853767.26975: checking for max_fail_percentage 30583 1726853767.26977: done checking for max_fail_percentage 30583 1726853767.26978: checking to see if all hosts have failed and the running result is not ok 30583 1726853767.26979: done checking to see if all hosts have failed 30583 1726853767.26979: getting the remaining hosts for this loop 30583 1726853767.26981: done getting the remaining hosts for this loop 30583 1726853767.26989: getting the next task for host managed_node2 30583 1726853767.27000: done getting next task for host managed_node2 30583 1726853767.27004: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30583 1726853767.27009: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853767.27032: getting variables 30583 1726853767.27033: in VariableManager get_vars() 30583 1726853767.27128: Calling all_inventory to load vars for managed_node2 30583 1726853767.27130: Calling groups_inventory to load vars for managed_node2 30583 1726853767.27132: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853767.27141: Calling all_plugins_play to load vars for managed_node2 30583 1726853767.27143: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853767.27145: Calling groups_plugins_play to load vars for managed_node2 30583 1726853767.28623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853767.30127: done with get_vars() 30583 1726853767.30156: done getting variables 30583 1726853767.30219: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:36:07 -0400 (0:00:00.088) 0:01:42.639 
****** 30583 1726853767.30256: entering _queue_task() for managed_node2/service 30583 1726853767.30669: worker is 1 (out of 1 available) 30583 1726853767.30683: exiting _queue_task() for managed_node2/service 30583 1726853767.30696: done queuing things up, now waiting for results queue to drain 30583 1726853767.30698: waiting for pending results... 30583 1726853767.31196: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30583 1726853767.31207: in run() - task 02083763-bbaf-05ea-abc5-000000001d39 30583 1726853767.31210: variable 'ansible_search_path' from source: unknown 30583 1726853767.31213: variable 'ansible_search_path' from source: unknown 30583 1726853767.31215: calling self._execute() 30583 1726853767.31308: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853767.31477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853767.31481: variable 'omit' from source: magic vars 30583 1726853767.31798: variable 'ansible_distribution_major_version' from source: facts 30583 1726853767.31818: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853767.31948: variable 'network_provider' from source: set_fact 30583 1726853767.31965: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853767.31978: when evaluation is False, skipping this task 30583 1726853767.31989: _execute() done 30583 1726853767.31999: dumping result to json 30583 1726853767.32007: done dumping result, returning 30583 1726853767.32021: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-05ea-abc5-000000001d39] 30583 1726853767.32030: sending task result for task 02083763-bbaf-05ea-abc5-000000001d39 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 
30583 1726853767.32203: no more pending results, returning what we have 30583 1726853767.32208: results queue empty 30583 1726853767.32213: checking for any_errors_fatal 30583 1726853767.32223: done checking for any_errors_fatal 30583 1726853767.32223: checking for max_fail_percentage 30583 1726853767.32227: done checking for max_fail_percentage 30583 1726853767.32229: checking to see if all hosts have failed and the running result is not ok 30583 1726853767.32230: done checking to see if all hosts have failed 30583 1726853767.32231: getting the remaining hosts for this loop 30583 1726853767.32232: done getting the remaining hosts for this loop 30583 1726853767.32236: getting the next task for host managed_node2 30583 1726853767.32250: done getting next task for host managed_node2 30583 1726853767.32253: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853767.32260: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853767.32297: getting variables 30583 1726853767.32300: in VariableManager get_vars() 30583 1726853767.32353: Calling all_inventory to load vars for managed_node2 30583 1726853767.32356: Calling groups_inventory to load vars for managed_node2 30583 1726853767.32358: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853767.32524: Calling all_plugins_play to load vars for managed_node2 30583 1726853767.32529: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853767.32535: done sending task result for task 02083763-bbaf-05ea-abc5-000000001d39 30583 1726853767.32538: WORKER PROCESS EXITING 30583 1726853767.32542: Calling groups_plugins_play to load vars for managed_node2 30583 1726853767.33970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853767.36403: done with get_vars() 30583 1726853767.36428: done getting variables 30583 1726853767.36488: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:36:07 -0400 (0:00:00.062) 0:01:42.702 ****** 30583 1726853767.36523: entering _queue_task() for managed_node2/copy 30583 1726853767.36996: worker is 1 (out of 1 available) 30583 1726853767.37011: exiting _queue_task() for managed_node2/copy 30583 1726853767.37025: done queuing things up, now waiting for results queue to drain 30583 1726853767.37027: waiting for pending 
results... 30583 1726853767.37370: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853767.37540: in run() - task 02083763-bbaf-05ea-abc5-000000001d3a 30583 1726853767.37564: variable 'ansible_search_path' from source: unknown 30583 1726853767.37575: variable 'ansible_search_path' from source: unknown 30583 1726853767.37623: calling self._execute() 30583 1726853767.37741: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853767.37754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853767.37773: variable 'omit' from source: magic vars 30583 1726853767.38185: variable 'ansible_distribution_major_version' from source: facts 30583 1726853767.38202: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853767.38323: variable 'network_provider' from source: set_fact 30583 1726853767.38335: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853767.38343: when evaluation is False, skipping this task 30583 1726853767.38351: _execute() done 30583 1726853767.38366: dumping result to json 30583 1726853767.38375: done dumping result, returning 30583 1726853767.38389: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-05ea-abc5-000000001d3a] 30583 1726853767.38399: sending task result for task 02083763-bbaf-05ea-abc5-000000001d3a skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30583 1726853767.38549: no more pending results, returning what we have 30583 1726853767.38553: results queue empty 30583 1726853767.38554: checking for any_errors_fatal 30583 1726853767.38562: done checking for any_errors_fatal 30583 1726853767.38563: checking for max_fail_percentage 
30583 1726853767.38564: done checking for max_fail_percentage 30583 1726853767.38565: checking to see if all hosts have failed and the running result is not ok 30583 1726853767.38566: done checking to see if all hosts have failed 30583 1726853767.38567: getting the remaining hosts for this loop 30583 1726853767.38569: done getting the remaining hosts for this loop 30583 1726853767.38574: getting the next task for host managed_node2 30583 1726853767.38584: done getting next task for host managed_node2 30583 1726853767.38588: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853767.38594: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853767.38621: getting variables 30583 1726853767.38623: in VariableManager get_vars() 30583 1726853767.38874: Calling all_inventory to load vars for managed_node2 30583 1726853767.38879: Calling groups_inventory to load vars for managed_node2 30583 1726853767.38882: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853767.38893: Calling all_plugins_play to load vars for managed_node2 30583 1726853767.38896: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853767.38899: Calling groups_plugins_play to load vars for managed_node2 30583 1726853767.39583: done sending task result for task 02083763-bbaf-05ea-abc5-000000001d3a 30583 1726853767.39587: WORKER PROCESS EXITING 30583 1726853767.41383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853767.43533: done with get_vars() 30583 1726853767.43569: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:36:07 -0400 (0:00:00.071) 0:01:42.773 ****** 30583 1726853767.43669: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853767.44036: worker is 1 (out of 1 available) 30583 1726853767.44048: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853767.44065: done queuing things up, now waiting for results queue to drain 30583 1726853767.44067: waiting for pending results... 
30583 1726853767.44389: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853767.44553: in run() - task 02083763-bbaf-05ea-abc5-000000001d3b 30583 1726853767.44580: variable 'ansible_search_path' from source: unknown 30583 1726853767.44587: variable 'ansible_search_path' from source: unknown 30583 1726853767.44637: calling self._execute() 30583 1726853767.44742: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853767.44754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853767.44773: variable 'omit' from source: magic vars 30583 1726853767.45179: variable 'ansible_distribution_major_version' from source: facts 30583 1726853767.45197: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853767.45209: variable 'omit' from source: magic vars 30583 1726853767.45287: variable 'omit' from source: magic vars 30583 1726853767.45448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853767.49160: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853767.49241: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853767.49286: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853767.49330: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853767.49364: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853767.49456: variable 'network_provider' from source: set_fact 30583 1726853767.49605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853767.49637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853767.49677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853767.49721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853767.49740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853767.49825: variable 'omit' from source: magic vars 30583 1726853767.49943: variable 'omit' from source: magic vars 30583 1726853767.50051: variable 'network_connections' from source: include params 30583 1726853767.50075: variable 'interface' from source: play vars 30583 1726853767.50146: variable 'interface' from source: play vars 30583 1726853767.50304: variable 'omit' from source: magic vars 30583 1726853767.50323: variable '__lsr_ansible_managed' from source: task vars 30583 1726853767.50387: variable '__lsr_ansible_managed' from source: task vars 30583 1726853767.50595: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30583 1726853767.50816: Loaded config def from plugin (lookup/template) 30583 1726853767.50826: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30583 1726853767.50864: File lookup term: get_ansible_managed.j2 30583 1726853767.50874: variable 
'ansible_search_path' from source: unknown 30583 1726853767.50885: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30583 1726853767.50902: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30583 1726853767.50924: variable 'ansible_search_path' from source: unknown 30583 1726853767.74425: variable 'ansible_managed' from source: unknown 30583 1726853767.74846: variable 'omit' from source: magic vars 30583 1726853767.74851: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853767.74854: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853767.74856: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853767.74863: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30583 1726853767.74885: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853767.74999: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853767.75003: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853767.75007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853767.75245: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853767.75251: Set connection var ansible_timeout to 10 30583 1726853767.75254: Set connection var ansible_connection to ssh 30583 1726853767.75261: Set connection var ansible_shell_executable to /bin/sh 30583 1726853767.75264: Set connection var ansible_shell_type to sh 30583 1726853767.75273: Set connection var ansible_pipelining to False 30583 1726853767.75301: variable 'ansible_shell_executable' from source: unknown 30583 1726853767.75304: variable 'ansible_connection' from source: unknown 30583 1726853767.75307: variable 'ansible_module_compression' from source: unknown 30583 1726853767.75309: variable 'ansible_shell_type' from source: unknown 30583 1726853767.75311: variable 'ansible_shell_executable' from source: unknown 30583 1726853767.75333: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853767.75441: variable 'ansible_pipelining' from source: unknown 30583 1726853767.75444: variable 'ansible_timeout' from source: unknown 30583 1726853767.75446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853767.75634: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853767.75645: variable 'omit' from 
source: magic vars 30583 1726853767.75768: starting attempt loop 30583 1726853767.75774: running the handler 30583 1726853767.75783: _low_level_execute_command(): starting 30583 1726853767.75789: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853767.77322: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853767.77520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853767.77559: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853767.77767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853767.79524: stdout chunk (state=3): >>>/root <<< 30583 1726853767.79664: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853767.79667: stdout chunk (state=3): >>><<< 30583 1726853767.79669: stderr chunk (state=3): >>><<< 30583 1726853767.79890: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853767.79903: _low_level_execute_command(): starting 30583 1726853767.79909: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853767.7989047-35424-48530537858020 `" && echo ansible-tmp-1726853767.7989047-35424-48530537858020="` echo /root/.ansible/tmp/ansible-tmp-1726853767.7989047-35424-48530537858020 `" ) && sleep 0' 30583 1726853767.81113: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853767.81118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853767.81254: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853767.81315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853767.81318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853767.81408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853767.81583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853767.83660: stdout chunk (state=3): >>>ansible-tmp-1726853767.7989047-35424-48530537858020=/root/.ansible/tmp/ansible-tmp-1726853767.7989047-35424-48530537858020 <<< 30583 1726853767.83713: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853767.83805: stderr chunk (state=3): >>><<< 30583 1726853767.83808: stdout chunk (state=3): >>><<< 30583 1726853767.83903: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853767.7989047-35424-48530537858020=/root/.ansible/tmp/ansible-tmp-1726853767.7989047-35424-48530537858020 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853767.83946: variable 'ansible_module_compression' from source: unknown 30583 1726853767.83988: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30583 1726853767.84099: variable 'ansible_facts' from source: unknown 30583 1726853767.84523: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853767.7989047-35424-48530537858020/AnsiballZ_network_connections.py 30583 1726853767.84690: Sending initial data 30583 1726853767.84699: Sent initial data (167 bytes) 30583 1726853767.85908: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853767.85917: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853767.85927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853767.86016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853767.86127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853767.86234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853767.86337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853767.88014: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30583 1726853767.88021: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 30583 1726853767.88044: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853767.88125: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853767.88208: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpkz55xncv /root/.ansible/tmp/ansible-tmp-1726853767.7989047-35424-48530537858020/AnsiballZ_network_connections.py <<< 30583 1726853767.88211: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853767.7989047-35424-48530537858020/AnsiballZ_network_connections.py" <<< 30583 1726853767.88289: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpkz55xncv" to remote "/root/.ansible/tmp/ansible-tmp-1726853767.7989047-35424-48530537858020/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853767.7989047-35424-48530537858020/AnsiballZ_network_connections.py" <<< 30583 1726853767.89778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853767.89783: stdout chunk (state=3): >>><<< 30583 1726853767.89786: stderr chunk (state=3): >>><<< 30583 1726853767.89788: done transferring module to remote 30583 1726853767.89790: _low_level_execute_command(): starting 30583 1726853767.89792: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853767.7989047-35424-48530537858020/ /root/.ansible/tmp/ansible-tmp-1726853767.7989047-35424-48530537858020/AnsiballZ_network_connections.py && sleep 0' 30583 1726853767.90464: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853767.90536: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853767.90765: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853767.90768: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853767.90776: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853767.90873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853767.93093: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853767.93097: stdout chunk (state=3): >>><<< 30583 1726853767.93099: stderr chunk (state=3): >>><<< 30583 1726853767.93190: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853767.93201: _low_level_execute_command(): starting 30583 1726853767.93204: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853767.7989047-35424-48530537858020/AnsiballZ_network_connections.py && sleep 0' 30583 1726853767.93874: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853767.93900: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853767.93917: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853767.93937: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853767.94048: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853768.27375: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30583 1726853768.29448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853768.29484: stderr chunk (state=3): >>><<< 30583 1726853768.29488: stdout chunk (state=3): >>><<< 30583 1726853768.29503: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853768.29530: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853767.7989047-35424-48530537858020/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853768.29537: _low_level_execute_command(): starting 30583 1726853768.29542: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853767.7989047-35424-48530537858020/ > /dev/null 2>&1 && sleep 0' 30583 1726853768.30016: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853768.30020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853768.30022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853768.30024: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853768.30026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853768.30063: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853768.30067: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853768.30080: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853768.30154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853768.32130: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853768.32156: stderr chunk (state=3): >>><<< 30583 1726853768.32159: stdout chunk (state=3): >>><<< 30583 1726853768.32181: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853768.32187: handler run complete 30583 1726853768.32206: attempt loop complete, returning result 30583 1726853768.32208: _execute() done 30583 1726853768.32211: dumping result to json 30583 1726853768.32215: done dumping result, returning 30583 1726853768.32222: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-05ea-abc5-000000001d3b] 30583 1726853768.32225: sending task result for task 02083763-bbaf-05ea-abc5-000000001d3b 30583 1726853768.32321: done sending task result for task 02083763-bbaf-05ea-abc5-000000001d3b 30583 1726853768.32325: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete 30583 1726853768.32423: no more pending results, returning what we have 30583 1726853768.32426: results queue empty 30583 1726853768.32427: checking for any_errors_fatal 30583 1726853768.32432: done checking for any_errors_fatal 30583 1726853768.32439: checking for max_fail_percentage 30583 1726853768.32441: done checking for max_fail_percentage 30583 1726853768.32442: checking to see if all hosts have failed and the running result is not ok 30583 1726853768.32443: done checking to see if all hosts have failed 30583 1726853768.32444: getting the remaining hosts for this loop 
30583 1726853768.32446: done getting the remaining hosts for this loop 30583 1726853768.32449: getting the next task for host managed_node2 30583 1726853768.32455: done getting next task for host managed_node2 30583 1726853768.32459: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853768.32463: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853768.32477: getting variables 30583 1726853768.32478: in VariableManager get_vars() 30583 1726853768.32516: Calling all_inventory to load vars for managed_node2 30583 1726853768.32519: Calling groups_inventory to load vars for managed_node2 30583 1726853768.32521: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853768.32529: Calling all_plugins_play to load vars for managed_node2 30583 1726853768.32531: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853768.32534: Calling groups_plugins_play to load vars for managed_node2 30583 1726853768.33565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853768.38691: done with get_vars() 30583 1726853768.38712: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:36:08 -0400 (0:00:00.950) 0:01:43.724 ****** 30583 1726853768.38765: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30583 1726853768.39045: worker is 1 (out of 1 available) 30583 1726853768.39060: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30583 1726853768.39074: done queuing things up, now waiting for results queue to drain 30583 1726853768.39077: waiting for pending results... 
30583 1726853768.39280: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853768.39392: in run() - task 02083763-bbaf-05ea-abc5-000000001d3c 30583 1726853768.39402: variable 'ansible_search_path' from source: unknown 30583 1726853768.39408: variable 'ansible_search_path' from source: unknown 30583 1726853768.39442: calling self._execute() 30583 1726853768.39524: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853768.39529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853768.39544: variable 'omit' from source: magic vars 30583 1726853768.39846: variable 'ansible_distribution_major_version' from source: facts 30583 1726853768.39861: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853768.39950: variable 'network_state' from source: role '' defaults 30583 1726853768.39958: Evaluated conditional (network_state != {}): False 30583 1726853768.39964: when evaluation is False, skipping this task 30583 1726853768.39968: _execute() done 30583 1726853768.39973: dumping result to json 30583 1726853768.39976: done dumping result, returning 30583 1726853768.39992: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-05ea-abc5-000000001d3c] 30583 1726853768.39995: sending task result for task 02083763-bbaf-05ea-abc5-000000001d3c 30583 1726853768.40077: done sending task result for task 02083763-bbaf-05ea-abc5-000000001d3c 30583 1726853768.40080: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853768.40139: no more pending results, returning what we have 30583 1726853768.40143: results queue empty 30583 1726853768.40144: checking for any_errors_fatal 30583 1726853768.40157: done checking for any_errors_fatal 
30583 1726853768.40158: checking for max_fail_percentage 30583 1726853768.40161: done checking for max_fail_percentage 30583 1726853768.40164: checking to see if all hosts have failed and the running result is not ok 30583 1726853768.40165: done checking to see if all hosts have failed 30583 1726853768.40165: getting the remaining hosts for this loop 30583 1726853768.40167: done getting the remaining hosts for this loop 30583 1726853768.40172: getting the next task for host managed_node2 30583 1726853768.40181: done getting next task for host managed_node2 30583 1726853768.40184: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30583 1726853768.40191: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853768.40219: getting variables 30583 1726853768.40220: in VariableManager get_vars() 30583 1726853768.40261: Calling all_inventory to load vars for managed_node2 30583 1726853768.40264: Calling groups_inventory to load vars for managed_node2 30583 1726853768.40266: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853768.40284: Calling all_plugins_play to load vars for managed_node2 30583 1726853768.40287: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853768.40290: Calling groups_plugins_play to load vars for managed_node2 30583 1726853768.41086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853768.41982: done with get_vars() 30583 1726853768.42000: done getting variables 30583 1726853768.42045: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:36:08 -0400 (0:00:00.033) 0:01:43.757 ****** 30583 1726853768.42075: entering _queue_task() for managed_node2/debug 30583 1726853768.42332: worker is 1 (out of 1 available) 30583 1726853768.42345: exiting _queue_task() for managed_node2/debug 30583 1726853768.42362: done queuing things up, now waiting for results queue to drain 30583 1726853768.42363: waiting for pending results... 
30583 1726853768.42564: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30583 1726853768.42690: in run() - task 02083763-bbaf-05ea-abc5-000000001d3d 30583 1726853768.42708: variable 'ansible_search_path' from source: unknown 30583 1726853768.42712: variable 'ansible_search_path' from source: unknown 30583 1726853768.42735: calling self._execute() 30583 1726853768.42815: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853768.42820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853768.42826: variable 'omit' from source: magic vars 30583 1726853768.43131: variable 'ansible_distribution_major_version' from source: facts 30583 1726853768.43145: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853768.43149: variable 'omit' from source: magic vars 30583 1726853768.43276: variable 'omit' from source: magic vars 30583 1726853768.43280: variable 'omit' from source: magic vars 30583 1726853768.43282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853768.43315: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853768.43341: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853768.43363: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853768.43383: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853768.43419: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853768.43428: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853768.43436: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 30583 1726853768.43537: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853768.43549: Set connection var ansible_timeout to 10 30583 1726853768.43556: Set connection var ansible_connection to ssh 30583 1726853768.43565: Set connection var ansible_shell_executable to /bin/sh 30583 1726853768.43573: Set connection var ansible_shell_type to sh 30583 1726853768.43588: Set connection var ansible_pipelining to False 30583 1726853768.43619: variable 'ansible_shell_executable' from source: unknown 30583 1726853768.43628: variable 'ansible_connection' from source: unknown 30583 1726853768.43676: variable 'ansible_module_compression' from source: unknown 30583 1726853768.43679: variable 'ansible_shell_type' from source: unknown 30583 1726853768.43681: variable 'ansible_shell_executable' from source: unknown 30583 1726853768.43683: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853768.43685: variable 'ansible_pipelining' from source: unknown 30583 1726853768.43687: variable 'ansible_timeout' from source: unknown 30583 1726853768.43689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853768.43814: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853768.43835: variable 'omit' from source: magic vars 30583 1726853768.43848: starting attempt loop 30583 1726853768.43856: running the handler 30583 1726853768.44076: variable '__network_connections_result' from source: set_fact 30583 1726853768.44083: handler run complete 30583 1726853768.44107: attempt loop complete, returning result 30583 1726853768.44115: _execute() done 30583 1726853768.44123: dumping result to json 30583 1726853768.44131: 
done dumping result, returning 30583 1726853768.44146: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-05ea-abc5-000000001d3d] 30583 1726853768.44156: sending task result for task 02083763-bbaf-05ea-abc5-000000001d3d ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } 30583 1726853768.44451: no more pending results, returning what we have 30583 1726853768.44455: results queue empty 30583 1726853768.44457: checking for any_errors_fatal 30583 1726853768.44463: done checking for any_errors_fatal 30583 1726853768.44464: checking for max_fail_percentage 30583 1726853768.44466: done checking for max_fail_percentage 30583 1726853768.44467: checking to see if all hosts have failed and the running result is not ok 30583 1726853768.44468: done checking to see if all hosts have failed 30583 1726853768.44469: getting the remaining hosts for this loop 30583 1726853768.44473: done getting the remaining hosts for this loop 30583 1726853768.44477: getting the next task for host managed_node2 30583 1726853768.44486: done getting next task for host managed_node2 30583 1726853768.44490: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30583 1726853768.44499: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853768.44512: getting variables 30583 1726853768.44514: in VariableManager get_vars() 30583 1726853768.44560: Calling all_inventory to load vars for managed_node2 30583 1726853768.44564: Calling groups_inventory to load vars for managed_node2 30583 1726853768.44566: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853768.44778: Calling all_plugins_play to load vars for managed_node2 30583 1726853768.44782: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853768.44787: Calling groups_plugins_play to load vars for managed_node2 30583 1726853768.45465: done sending task result for task 02083763-bbaf-05ea-abc5-000000001d3d 30583 1726853768.45469: WORKER PROCESS EXITING 30583 1726853768.45805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853768.46681: done with get_vars() 30583 1726853768.46700: done getting variables 30583 1726853768.46744: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:36:08 -0400 (0:00:00.047) 0:01:43.804 ****** 30583 1726853768.46778: entering _queue_task() for managed_node2/debug 30583 1726853768.47042: worker is 1 (out of 1 available) 30583 1726853768.47056: exiting _queue_task() for managed_node2/debug 30583 1726853768.47067: done queuing things up, now waiting for results queue to drain 30583 1726853768.47069: waiting for pending results... 30583 1726853768.47272: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30583 1726853768.47384: in run() - task 02083763-bbaf-05ea-abc5-000000001d3e 30583 1726853768.47395: variable 'ansible_search_path' from source: unknown 30583 1726853768.47400: variable 'ansible_search_path' from source: unknown 30583 1726853768.47431: calling self._execute() 30583 1726853768.47510: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853768.47515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853768.47528: variable 'omit' from source: magic vars 30583 1726853768.47820: variable 'ansible_distribution_major_version' from source: facts 30583 1726853768.47830: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853768.47836: variable 'omit' from source: magic vars 30583 1726853768.47888: variable 'omit' from source: magic vars 30583 1726853768.47910: variable 'omit' from source: magic vars 30583 1726853768.47942: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853768.47976: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853768.47993: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853768.48007: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853768.48016: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853768.48039: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853768.48043: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853768.48046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853768.48120: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853768.48124: Set connection var ansible_timeout to 10 30583 1726853768.48127: Set connection var ansible_connection to ssh 30583 1726853768.48132: Set connection var ansible_shell_executable to /bin/sh 30583 1726853768.48134: Set connection var ansible_shell_type to sh 30583 1726853768.48142: Set connection var ansible_pipelining to False 30583 1726853768.48160: variable 'ansible_shell_executable' from source: unknown 30583 1726853768.48166: variable 'ansible_connection' from source: unknown 30583 1726853768.48174: variable 'ansible_module_compression' from source: unknown 30583 1726853768.48178: variable 'ansible_shell_type' from source: unknown 30583 1726853768.48181: variable 'ansible_shell_executable' from source: unknown 30583 1726853768.48183: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853768.48185: variable 'ansible_pipelining' from source: unknown 30583 1726853768.48187: variable 'ansible_timeout' from source: unknown 30583 1726853768.48189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853768.48288: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853768.48297: variable 'omit' from source: magic vars 30583 1726853768.48301: starting attempt loop 30583 1726853768.48306: running the handler 30583 1726853768.48347: variable '__network_connections_result' from source: set_fact 30583 1726853768.48405: variable '__network_connections_result' from source: set_fact 30583 1726853768.48487: handler run complete 30583 1726853768.48504: attempt loop complete, returning result 30583 1726853768.48507: _execute() done 30583 1726853768.48509: dumping result to json 30583 1726853768.48512: done dumping result, returning 30583 1726853768.48520: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-05ea-abc5-000000001d3e] 30583 1726853768.48522: sending task result for task 02083763-bbaf-05ea-abc5-000000001d3e 30583 1726853768.48610: done sending task result for task 02083763-bbaf-05ea-abc5-000000001d3e 30583 1726853768.48613: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "stderr_lines": [ "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } } 30583 1726853768.48727: no more pending results, returning what we have 30583 1726853768.48731: results queue empty 30583 1726853768.48732: checking for any_errors_fatal 30583 1726853768.48738: 
done checking for any_errors_fatal 30583 1726853768.48738: checking for max_fail_percentage 30583 1726853768.48740: done checking for max_fail_percentage 30583 1726853768.48741: checking to see if all hosts have failed and the running result is not ok 30583 1726853768.48744: done checking to see if all hosts have failed 30583 1726853768.48744: getting the remaining hosts for this loop 30583 1726853768.48746: done getting the remaining hosts for this loop 30583 1726853768.48749: getting the next task for host managed_node2 30583 1726853768.48756: done getting next task for host managed_node2 30583 1726853768.48760: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30583 1726853768.48764: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853768.48776: getting variables 30583 1726853768.48778: in VariableManager get_vars() 30583 1726853768.48812: Calling all_inventory to load vars for managed_node2 30583 1726853768.48814: Calling groups_inventory to load vars for managed_node2 30583 1726853768.48816: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853768.48829: Calling all_plugins_play to load vars for managed_node2 30583 1726853768.48832: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853768.48834: Calling groups_plugins_play to load vars for managed_node2 30583 1726853768.49630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853768.50527: done with get_vars() 30583 1726853768.50544: done getting variables 30583 1726853768.50589: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:36:08 -0400 (0:00:00.038) 0:01:43.843 ****** 30583 1726853768.50617: entering _queue_task() for managed_node2/debug 30583 1726853768.50912: worker is 1 (out of 1 available) 30583 1726853768.50926: exiting _queue_task() for managed_node2/debug 30583 1726853768.50939: done queuing things up, now waiting for results queue to drain 30583 1726853768.50940: waiting for pending results... 
30583 1726853768.51141: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30583 1726853768.51242: in run() - task 02083763-bbaf-05ea-abc5-000000001d3f 30583 1726853768.51253: variable 'ansible_search_path' from source: unknown 30583 1726853768.51257: variable 'ansible_search_path' from source: unknown 30583 1726853768.51292: calling self._execute() 30583 1726853768.51373: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853768.51377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853768.51389: variable 'omit' from source: magic vars 30583 1726853768.51683: variable 'ansible_distribution_major_version' from source: facts 30583 1726853768.51693: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853768.51786: variable 'network_state' from source: role '' defaults 30583 1726853768.51795: Evaluated conditional (network_state != {}): False 30583 1726853768.51798: when evaluation is False, skipping this task 30583 1726853768.51801: _execute() done 30583 1726853768.51803: dumping result to json 30583 1726853768.51806: done dumping result, returning 30583 1726853768.51816: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-05ea-abc5-000000001d3f] 30583 1726853768.51819: sending task result for task 02083763-bbaf-05ea-abc5-000000001d3f 30583 1726853768.51911: done sending task result for task 02083763-bbaf-05ea-abc5-000000001d3f 30583 1726853768.51913: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 30583 1726853768.51979: no more pending results, returning what we have 30583 1726853768.51983: results queue empty 30583 1726853768.51984: checking for any_errors_fatal 30583 1726853768.51994: done checking for any_errors_fatal 30583 1726853768.51994: checking for 
max_fail_percentage 30583 1726853768.51996: done checking for max_fail_percentage 30583 1726853768.51997: checking to see if all hosts have failed and the running result is not ok 30583 1726853768.51998: done checking to see if all hosts have failed 30583 1726853768.51999: getting the remaining hosts for this loop 30583 1726853768.52001: done getting the remaining hosts for this loop 30583 1726853768.52004: getting the next task for host managed_node2 30583 1726853768.52012: done getting next task for host managed_node2 30583 1726853768.52016: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30583 1726853768.52021: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853768.52045: getting variables 30583 1726853768.52047: in VariableManager get_vars() 30583 1726853768.52088: Calling all_inventory to load vars for managed_node2 30583 1726853768.52091: Calling groups_inventory to load vars for managed_node2 30583 1726853768.52093: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853768.52102: Calling all_plugins_play to load vars for managed_node2 30583 1726853768.52105: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853768.52107: Calling groups_plugins_play to load vars for managed_node2 30583 1726853768.53004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853768.53879: done with get_vars() 30583 1726853768.53894: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:36:08 -0400 (0:00:00.033) 0:01:43.876 ****** 30583 1726853768.53966: entering _queue_task() for managed_node2/ping 30583 1726853768.54201: worker is 1 (out of 1 available) 30583 1726853768.54215: exiting _queue_task() for managed_node2/ping 30583 1726853768.54228: done queuing things up, now waiting for results queue to drain 30583 1726853768.54230: waiting for pending results... 
30583 1726853768.54438: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 30583 1726853768.54547: in run() - task 02083763-bbaf-05ea-abc5-000000001d40 30583 1726853768.54559: variable 'ansible_search_path' from source: unknown 30583 1726853768.54562: variable 'ansible_search_path' from source: unknown 30583 1726853768.54598: calling self._execute() 30583 1726853768.54675: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853768.54681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853768.54694: variable 'omit' from source: magic vars 30583 1726853768.54990: variable 'ansible_distribution_major_version' from source: facts 30583 1726853768.54999: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853768.55006: variable 'omit' from source: magic vars 30583 1726853768.55080: variable 'omit' from source: magic vars 30583 1726853768.55177: variable 'omit' from source: magic vars 30583 1726853768.55180: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853768.55183: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853768.55195: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853768.55215: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853768.55231: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853768.55267: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853768.55281: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853768.55288: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30583 1726853768.55393: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853768.55404: Set connection var ansible_timeout to 10 30583 1726853768.55411: Set connection var ansible_connection to ssh 30583 1726853768.55420: Set connection var ansible_shell_executable to /bin/sh 30583 1726853768.55427: Set connection var ansible_shell_type to sh 30583 1726853768.55441: Set connection var ansible_pipelining to False 30583 1726853768.55473: variable 'ansible_shell_executable' from source: unknown 30583 1726853768.55482: variable 'ansible_connection' from source: unknown 30583 1726853768.55490: variable 'ansible_module_compression' from source: unknown 30583 1726853768.55497: variable 'ansible_shell_type' from source: unknown 30583 1726853768.55503: variable 'ansible_shell_executable' from source: unknown 30583 1726853768.55510: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853768.55517: variable 'ansible_pipelining' from source: unknown 30583 1726853768.55562: variable 'ansible_timeout' from source: unknown 30583 1726853768.55566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853768.55876: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853768.55881: variable 'omit' from source: magic vars 30583 1726853768.55883: starting attempt loop 30583 1726853768.55885: running the handler 30583 1726853768.55887: _low_level_execute_command(): starting 30583 1726853768.55890: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853768.56442: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853768.56456: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 
1726853768.56481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853768.56499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853768.56597: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853768.56622: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853768.56757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853768.58447: stdout chunk (state=3): >>>/root <<< 30583 1726853768.58544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853768.58580: stderr chunk (state=3): >>><<< 30583 1726853768.58583: stdout chunk (state=3): >>><<< 30583 1726853768.58604: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853768.58616: _low_level_execute_command(): starting 30583 1726853768.58621: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853768.5860448-35463-12005659564577 `" && echo ansible-tmp-1726853768.5860448-35463-12005659564577="` echo /root/.ansible/tmp/ansible-tmp-1726853768.5860448-35463-12005659564577 `" ) && sleep 0' 30583 1726853768.59047: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853768.59050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853768.59053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853768.59063: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853768.59066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853768.59107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853768.59115: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853768.59194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853768.61577: stdout chunk (state=3): >>>ansible-tmp-1726853768.5860448-35463-12005659564577=/root/.ansible/tmp/ansible-tmp-1726853768.5860448-35463-12005659564577 <<< 30583 1726853768.61582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853768.61584: stdout chunk (state=3): >>><<< 30583 1726853768.61587: stderr chunk (state=3): >>><<< 30583 1726853768.61589: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853768.5860448-35463-12005659564577=/root/.ansible/tmp/ansible-tmp-1726853768.5860448-35463-12005659564577 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853768.61591: variable 'ansible_module_compression' from source: unknown 30583 1726853768.61593: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30583 1726853768.61595: variable 'ansible_facts' from source: unknown 30583 1726853768.62108: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853768.5860448-35463-12005659564577/AnsiballZ_ping.py 30583 1726853768.62254: Sending initial data 30583 1726853768.62483: Sent initial data (152 bytes) 30583 1726853768.63013: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853768.63017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853768.63020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853768.63023: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853768.63035: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853768.63286: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853768.63317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853768.65017: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853768.65101: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853768.65174: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpnrhs19my /root/.ansible/tmp/ansible-tmp-1726853768.5860448-35463-12005659564577/AnsiballZ_ping.py <<< 30583 1726853768.65196: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853768.5860448-35463-12005659564577/AnsiballZ_ping.py" <<< 30583 1726853768.65279: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpnrhs19my" to remote "/root/.ansible/tmp/ansible-tmp-1726853768.5860448-35463-12005659564577/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853768.5860448-35463-12005659564577/AnsiballZ_ping.py" <<< 30583 1726853768.66308: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853768.66320: stdout chunk (state=3): >>><<< 30583 1726853768.66334: stderr chunk (state=3): >>><<< 30583 1726853768.66403: done transferring module to remote 30583 1726853768.66423: _low_level_execute_command(): starting 30583 1726853768.66466: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853768.5860448-35463-12005659564577/ /root/.ansible/tmp/ansible-tmp-1726853768.5860448-35463-12005659564577/AnsiballZ_ping.py && sleep 0' 30583 1726853768.67135: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853768.67144: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853768.67155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853768.67170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853768.67190: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853768.67196: 
stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853768.67218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853768.67221: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853768.67224: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853768.67231: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853768.67305: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853768.67308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853768.67310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853768.67312: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853768.67316: stderr chunk (state=3): >>>debug2: match found <<< 30583 1726853768.67318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853768.67341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853768.67364: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853768.67459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853768.69461: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853768.69464: stdout chunk (state=3): >>><<< 30583 1726853768.69466: stderr chunk (state=3): >>><<< 30583 1726853768.69640: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853768.69644: _low_level_execute_command(): starting 30583 1726853768.69647: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853768.5860448-35463-12005659564577/AnsiballZ_ping.py && sleep 0' 30583 1726853768.70559: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853768.70636: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853768.70663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853768.70697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853768.70862: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853768.70905: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853768.70941: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853768.71037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853768.86569: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30583 1726853768.87957: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853768.87984: stderr chunk (state=3): >>><<< 30583 1726853768.87987: stdout chunk (state=3): >>><<< 30583 1726853768.88004: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
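The stdout chunk above, `{"ping": "pong", "invocation": {...}}`, is the single JSON object the `ping` module printed on the remote host; the controller parses it out of `_low_level_execute_command()`'s stdout while discarding the OpenSSH debug noise on stderr. As orientation, a minimal sketch of the module's core behavior, heavily simplified: the real `ansible.builtin.ping` goes through `AnsibleModule` argument parsing, and `ping_module` here is a hypothetical stand-in, not Ansible API:

```python
import json

def ping_module(module_args):
    """Simplified stand-in for ansible.builtin.ping's core logic."""
    data = module_args.get("data", "pong")
    if data == "crash":
        # the real module deliberately raises here so error handling can be tested
        raise Exception("boom")
    # A module reports its result as one JSON object on stdout; the controller
    # reads it back from the remote command's captured stdout.
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

print(json.dumps(ping_module({"data": "pong"})))
```

Running this prints the same shape seen in the log: `{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}`.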
30583 1726853768.88026: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853768.5860448-35463-12005659564577/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853768.88036: _low_level_execute_command(): starting 30583 1726853768.88040: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853768.5860448-35463-12005659564577/ > /dev/null 2>&1 && sleep 0' 30583 1726853768.88504: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853768.88508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853768.88510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853768.88512: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853768.88515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853768.88560: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853768.88579: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853768.88639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853768.90511: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853768.90533: stderr chunk (state=3): >>><<< 30583 1726853768.90536: stdout chunk (state=3): >>><<< 30583 1726853768.90548: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
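Between the sftp transfer and the cleanup above, the log shows the fixed three-step lifecycle the ssh connection plugin drives for every module: `chmod u+x` on the temp dir and AnsiballZ payload, execution with the discovered remote Python, then `rm -f -r` on the temp dir. A hedged sketch of how those shell lines fit together, with the `&& sleep 0` suffix matching the log; `build_module_commands` is a hypothetical helper for illustration, not Ansible's internal API:

```python
def build_module_commands(tmpdir, payload, remote_python="/usr/bin/python3.12"):
    """Assemble the three shell commands visible in the
    _low_level_execute_command() calls; each is then wrapped in
    /bin/sh -c '...' by the connection plugin."""
    return [
        f"chmod u+x {tmpdir}/ {tmpdir}/{payload} && sleep 0",
        f"{remote_python} {tmpdir}/{payload} && sleep 0",
        f"rm -f -r {tmpdir}/ > /dev/null 2>&1 && sleep 0",
    ]

for cmd in build_module_commands("/root/.ansible/tmp/ansible-tmp-example",
                                 "AnsiballZ_ping.py"):
    print(cmd)
```

Because an existing ControlMaster socket is reused (`auto-mux: Trying existing master at '/root/.ansible/cp/429203141d'`), each of these commands rides the multiplexed connection rather than opening a new SSH session.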
30583 1726853768.90554: handler run complete 30583 1726853768.90572: attempt loop complete, returning result 30583 1726853768.90575: _execute() done 30583 1726853768.90577: dumping result to json 30583 1726853768.90579: done dumping result, returning 30583 1726853768.90590: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-05ea-abc5-000000001d40] 30583 1726853768.90593: sending task result for task 02083763-bbaf-05ea-abc5-000000001d40 30583 1726853768.90691: done sending task result for task 02083763-bbaf-05ea-abc5-000000001d40 30583 1726853768.90694: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 30583 1726853768.90761: no more pending results, returning what we have 30583 1726853768.90764: results queue empty 30583 1726853768.90765: checking for any_errors_fatal 30583 1726853768.90773: done checking for any_errors_fatal 30583 1726853768.90774: checking for max_fail_percentage 30583 1726853768.90776: done checking for max_fail_percentage 30583 1726853768.90777: checking to see if all hosts have failed and the running result is not ok 30583 1726853768.90778: done checking to see if all hosts have failed 30583 1726853768.90778: getting the remaining hosts for this loop 30583 1726853768.90780: done getting the remaining hosts for this loop 30583 1726853768.90783: getting the next task for host managed_node2 30583 1726853768.90794: done getting next task for host managed_node2 30583 1726853768.90796: ^ task is: TASK: meta (role_complete) 30583 1726853768.90802: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853768.90814: getting variables 30583 1726853768.90815: in VariableManager get_vars() 30583 1726853768.90861: Calling all_inventory to load vars for managed_node2 30583 1726853768.90864: Calling groups_inventory to load vars for managed_node2 30583 1726853768.90866: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853768.90884: Calling all_plugins_play to load vars for managed_node2 30583 1726853768.90887: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853768.90890: Calling groups_plugins_play to load vars for managed_node2 30583 1726853768.91726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853768.93042: done with get_vars() 30583 1726853768.93072: done getting variables 30583 1726853768.93152: done queuing things up, now waiting for results queue to drain 30583 1726853768.93154: results queue empty 30583 1726853768.93155: checking for any_errors_fatal 30583 1726853768.93160: done checking for any_errors_fatal 30583 1726853768.93161: checking for max_fail_percentage 30583 1726853768.93162: done checking for max_fail_percentage 30583 1726853768.93163: checking to see if all 
hosts have failed and the running result is not ok 30583 1726853768.93163: done checking to see if all hosts have failed 30583 1726853768.93164: getting the remaining hosts for this loop 30583 1726853768.93165: done getting the remaining hosts for this loop 30583 1726853768.93168: getting the next task for host managed_node2 30583 1726853768.93175: done getting next task for host managed_node2 30583 1726853768.93177: ^ task is: TASK: Asserts 30583 1726853768.93180: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853768.93183: getting variables 30583 1726853768.93184: in VariableManager get_vars() 30583 1726853768.93196: Calling all_inventory to load vars for managed_node2 30583 1726853768.93198: Calling groups_inventory to load vars for managed_node2 30583 1726853768.93200: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853768.93205: Calling all_plugins_play to load vars for managed_node2 30583 1726853768.93207: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853768.93210: Calling groups_plugins_play to load vars for managed_node2 30583 1726853768.94116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853768.95010: done with get_vars() 30583 1726853768.95029: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 13:36:08 -0400 (0:00:00.411) 0:01:44.288 ****** 30583 1726853768.95089: entering _queue_task() for managed_node2/include_tasks 30583 1726853768.95365: worker is 1 (out of 1 available) 30583 1726853768.95383: exiting _queue_task() for managed_node2/include_tasks 30583 1726853768.95395: done queuing things up, now waiting for results queue to drain 30583 1726853768.95397: waiting for pending results... 
30583 1726853768.95601: running TaskExecutor() for managed_node2/TASK: Asserts 30583 1726853768.95688: in run() - task 02083763-bbaf-05ea-abc5-000000001749 30583 1726853768.95699: variable 'ansible_search_path' from source: unknown 30583 1726853768.95702: variable 'ansible_search_path' from source: unknown 30583 1726853768.95743: variable 'lsr_assert' from source: include params 30583 1726853768.95919: variable 'lsr_assert' from source: include params 30583 1726853768.95985: variable 'omit' from source: magic vars 30583 1726853768.96090: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853768.96098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853768.96106: variable 'omit' from source: magic vars 30583 1726853768.96275: variable 'ansible_distribution_major_version' from source: facts 30583 1726853768.96284: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853768.96288: variable 'item' from source: unknown 30583 1726853768.96333: variable 'item' from source: unknown 30583 1726853768.96356: variable 'item' from source: unknown 30583 1726853768.96425: variable 'item' from source: unknown 30583 1726853768.96545: dumping result to json 30583 1726853768.96547: done dumping result, returning 30583 1726853768.96549: done running TaskExecutor() for managed_node2/TASK: Asserts [02083763-bbaf-05ea-abc5-000000001749] 30583 1726853768.96551: sending task result for task 02083763-bbaf-05ea-abc5-000000001749 30583 1726853768.96593: done sending task result for task 02083763-bbaf-05ea-abc5-000000001749 30583 1726853768.96596: WORKER PROCESS EXITING 30583 1726853768.96702: no more pending results, returning what we have 30583 1726853768.96707: in VariableManager get_vars() 30583 1726853768.96755: Calling all_inventory to load vars for managed_node2 30583 1726853768.96760: Calling groups_inventory to load vars for managed_node2 30583 1726853768.96764: Calling all_plugins_inventory 
to load vars for managed_node2 30583 1726853768.96779: Calling all_plugins_play to load vars for managed_node2 30583 1726853768.96787: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853768.96793: Calling groups_plugins_play to load vars for managed_node2 30583 1726853768.98583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853769.00289: done with get_vars() 30583 1726853769.00315: variable 'ansible_search_path' from source: unknown 30583 1726853769.00317: variable 'ansible_search_path' from source: unknown 30583 1726853769.00366: we have included files to process 30583 1726853769.00367: generating all_blocks data 30583 1726853769.00370: done generating all_blocks data 30583 1726853769.00377: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30583 1726853769.00378: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30583 1726853769.00380: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30583 1726853769.00493: in VariableManager get_vars() 30583 1726853769.00518: done with get_vars() 30583 1726853769.00639: done processing included file 30583 1726853769.00641: iterating over new_blocks loaded from include file 30583 1726853769.00643: in VariableManager get_vars() 30583 1726853769.00664: done with get_vars() 30583 1726853769.00666: filtering new block on tags 30583 1726853769.00716: done filtering new block on tags 30583 1726853769.00719: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node2 => (item=tasks/assert_profile_absent.yml) 30583 
1726853769.00725: extending task lists for all hosts with included blocks 30583 1726853769.01953: done extending task lists 30583 1726853769.01955: done processing included files 30583 1726853769.01956: results queue empty 30583 1726853769.01957: checking for any_errors_fatal 30583 1726853769.01961: done checking for any_errors_fatal 30583 1726853769.01962: checking for max_fail_percentage 30583 1726853769.01963: done checking for max_fail_percentage 30583 1726853769.01965: checking to see if all hosts have failed and the running result is not ok 30583 1726853769.01965: done checking to see if all hosts have failed 30583 1726853769.01966: getting the remaining hosts for this loop 30583 1726853769.01968: done getting the remaining hosts for this loop 30583 1726853769.01972: getting the next task for host managed_node2 30583 1726853769.01977: done getting next task for host managed_node2 30583 1726853769.01979: ^ task is: TASK: Include the task 'get_profile_stat.yml' 30583 1726853769.01991: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853769.01995: getting variables 30583 1726853769.01996: in VariableManager get_vars() 30583 1726853769.02010: Calling all_inventory to load vars for managed_node2 30583 1726853769.02013: Calling groups_inventory to load vars for managed_node2 30583 1726853769.02015: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853769.02022: Calling all_plugins_play to load vars for managed_node2 30583 1726853769.02024: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853769.02027: Calling groups_plugins_play to load vars for managed_node2 30583 1726853769.03346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853769.04897: done with get_vars() 30583 1726853769.04930: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 13:36:09 -0400 (0:00:00.099) 0:01:44.387 ****** 30583 1726853769.05009: entering _queue_task() for managed_node2/include_tasks 30583 1726853769.05606: worker is 1 (out of 1 available) 30583 1726853769.05616: exiting _queue_task() for managed_node2/include_tasks 30583 1726853769.05626: done queuing things up, now waiting for results queue to drain 30583 1726853769.05627: waiting for pending results... 
30583 1726853769.06291: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 30583 1726853769.06299: in run() - task 02083763-bbaf-05ea-abc5-000000001e99 30583 1726853769.06303: variable 'ansible_search_path' from source: unknown 30583 1726853769.06307: variable 'ansible_search_path' from source: unknown 30583 1726853769.06314: calling self._execute() 30583 1726853769.06317: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853769.06319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853769.06321: variable 'omit' from source: magic vars 30583 1726853769.06534: variable 'ansible_distribution_major_version' from source: facts 30583 1726853769.06777: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853769.06780: _execute() done 30583 1726853769.06781: dumping result to json 30583 1726853769.06783: done dumping result, returning 30583 1726853769.06787: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-05ea-abc5-000000001e99] 30583 1726853769.06788: sending task result for task 02083763-bbaf-05ea-abc5-000000001e99 30583 1726853769.06850: done sending task result for task 02083763-bbaf-05ea-abc5-000000001e99 30583 1726853769.06881: no more pending results, returning what we have 30583 1726853769.06886: in VariableManager get_vars() 30583 1726853769.06931: Calling all_inventory to load vars for managed_node2 30583 1726853769.06941: Calling groups_inventory to load vars for managed_node2 30583 1726853769.06946: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853769.06962: Calling all_plugins_play to load vars for managed_node2 30583 1726853769.06966: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853769.06972: Calling groups_plugins_play to load vars for managed_node2 30583 1726853769.07513: WORKER PROCESS EXITING 30583 
1726853769.09097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853769.10755: done with get_vars() 30583 1726853769.10782: variable 'ansible_search_path' from source: unknown 30583 1726853769.10784: variable 'ansible_search_path' from source: unknown 30583 1726853769.10793: variable 'item' from source: include params 30583 1726853769.10912: variable 'item' from source: include params 30583 1726853769.10946: we have included files to process 30583 1726853769.10948: generating all_blocks data 30583 1726853769.10949: done generating all_blocks data 30583 1726853769.10951: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30583 1726853769.10951: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30583 1726853769.10954: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30583 1726853769.11895: done processing included file 30583 1726853769.11898: iterating over new_blocks loaded from include file 30583 1726853769.11899: in VariableManager get_vars() 30583 1726853769.11917: done with get_vars() 30583 1726853769.11919: filtering new block on tags 30583 1726853769.11991: done filtering new block on tags 30583 1726853769.11994: in VariableManager get_vars() 30583 1726853769.12009: done with get_vars() 30583 1726853769.12011: filtering new block on tags 30583 1726853769.12066: done filtering new block on tags 30583 1726853769.12069: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 30583 1726853769.12075: extending task lists for all hosts with included blocks 30583 1726853769.12310: done 
extending task lists 30583 1726853769.12312: done processing included files 30583 1726853769.12312: results queue empty 30583 1726853769.12313: checking for any_errors_fatal 30583 1726853769.12317: done checking for any_errors_fatal 30583 1726853769.12318: checking for max_fail_percentage 30583 1726853769.12319: done checking for max_fail_percentage 30583 1726853769.12320: checking to see if all hosts have failed and the running result is not ok 30583 1726853769.12321: done checking to see if all hosts have failed 30583 1726853769.12321: getting the remaining hosts for this loop 30583 1726853769.12323: done getting the remaining hosts for this loop 30583 1726853769.12325: getting the next task for host managed_node2 30583 1726853769.12329: done getting next task for host managed_node2 30583 1726853769.12332: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 30583 1726853769.12335: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30583 1726853769.12338: getting variables 30583 1726853769.12339: in VariableManager get_vars() 30583 1726853769.12349: Calling all_inventory to load vars for managed_node2 30583 1726853769.12352: Calling groups_inventory to load vars for managed_node2 30583 1726853769.12354: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853769.12360: Calling all_plugins_play to load vars for managed_node2 30583 1726853769.12363: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853769.12366: Calling groups_plugins_play to load vars for managed_node2 30583 1726853769.13510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853769.15269: done with get_vars() 30583 1726853769.15394: done getting variables 30583 1726853769.15436: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:36:09 -0400 (0:00:00.104) 0:01:44.491 ****** 30583 1726853769.15469: entering _queue_task() for managed_node2/set_fact 30583 1726853769.16234: worker is 1 (out of 1 available) 30583 1726853769.16250: exiting _queue_task() for managed_node2/set_fact 30583 1726853769.16263: done queuing things up, now waiting for results queue to drain 30583 1726853769.16265: waiting for pending results... 
30583 1726853769.16923: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 30583 1726853769.17191: in run() - task 02083763-bbaf-05ea-abc5-000000001f17 30583 1726853769.17212: variable 'ansible_search_path' from source: unknown 30583 1726853769.17216: variable 'ansible_search_path' from source: unknown 30583 1726853769.17251: calling self._execute() 30583 1726853769.17541: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853769.17547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853769.17556: variable 'omit' from source: magic vars 30583 1726853769.18441: variable 'ansible_distribution_major_version' from source: facts 30583 1726853769.18575: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853769.18580: variable 'omit' from source: magic vars 30583 1726853769.18583: variable 'omit' from source: magic vars 30583 1726853769.18655: variable 'omit' from source: magic vars 30583 1726853769.18668: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853769.19282: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853769.19285: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853769.19288: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853769.19290: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853769.19293: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853769.19295: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853769.19297: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30583 1726853769.19299: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853769.19301: Set connection var ansible_timeout to 10 30583 1726853769.19303: Set connection var ansible_connection to ssh 30583 1726853769.19305: Set connection var ansible_shell_executable to /bin/sh 30583 1726853769.19307: Set connection var ansible_shell_type to sh 30583 1726853769.19308: Set connection var ansible_pipelining to False 30583 1726853769.19310: variable 'ansible_shell_executable' from source: unknown 30583 1726853769.19312: variable 'ansible_connection' from source: unknown 30583 1726853769.19315: variable 'ansible_module_compression' from source: unknown 30583 1726853769.19316: variable 'ansible_shell_type' from source: unknown 30583 1726853769.19318: variable 'ansible_shell_executable' from source: unknown 30583 1726853769.19320: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853769.19322: variable 'ansible_pipelining' from source: unknown 30583 1726853769.19324: variable 'ansible_timeout' from source: unknown 30583 1726853769.19326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853769.19431: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853769.19442: variable 'omit' from source: magic vars 30583 1726853769.19453: starting attempt loop 30583 1726853769.19456: running the handler 30583 1726853769.19473: handler run complete 30583 1726853769.19490: attempt loop complete, returning result 30583 1726853769.19493: _execute() done 30583 1726853769.19495: dumping result to json 30583 1726853769.19497: done dumping result, returning 30583 1726853769.19502: done running TaskExecutor() for 
managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [02083763-bbaf-05ea-abc5-000000001f17] 30583 1726853769.19507: sending task result for task 02083763-bbaf-05ea-abc5-000000001f17 30583 1726853769.19877: done sending task result for task 02083763-bbaf-05ea-abc5-000000001f17 30583 1726853769.19880: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 30583 1726853769.19923: no more pending results, returning what we have 30583 1726853769.19926: results queue empty 30583 1726853769.19927: checking for any_errors_fatal 30583 1726853769.19928: done checking for any_errors_fatal 30583 1726853769.19929: checking for max_fail_percentage 30583 1726853769.19931: done checking for max_fail_percentage 30583 1726853769.19931: checking to see if all hosts have failed and the running result is not ok 30583 1726853769.19932: done checking to see if all hosts have failed 30583 1726853769.19933: getting the remaining hosts for this loop 30583 1726853769.19934: done getting the remaining hosts for this loop 30583 1726853769.19937: getting the next task for host managed_node2 30583 1726853769.19945: done getting next task for host managed_node2 30583 1726853769.19947: ^ task is: TASK: Stat profile file 30583 1726853769.19952: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853769.19956: getting variables 30583 1726853769.19957: in VariableManager get_vars() 30583 1726853769.19994: Calling all_inventory to load vars for managed_node2 30583 1726853769.19997: Calling groups_inventory to load vars for managed_node2 30583 1726853769.20000: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853769.20010: Calling all_plugins_play to load vars for managed_node2 30583 1726853769.20013: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853769.20016: Calling groups_plugins_play to load vars for managed_node2 30583 1726853769.21630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853769.24811: done with get_vars() 30583 1726853769.24844: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:36:09 -0400 (0:00:00.096) 0:01:44.588 ****** 30583 1726853769.25152: entering _queue_task() for managed_node2/stat 30583 1726853769.25929: worker is 1 (out of 1 available) 30583 1726853769.25943: exiting _queue_task() for managed_node2/stat 30583 1726853769.25955: done queuing things up, now waiting for results queue to drain 30583 1726853769.25956: 
waiting for pending results... 30583 1726853769.26458: running TaskExecutor() for managed_node2/TASK: Stat profile file 30583 1726853769.26815: in run() - task 02083763-bbaf-05ea-abc5-000000001f18 30583 1726853769.26835: variable 'ansible_search_path' from source: unknown 30583 1726853769.26843: variable 'ansible_search_path' from source: unknown 30583 1726853769.26923: calling self._execute() 30583 1726853769.27277: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853769.27281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853769.27283: variable 'omit' from source: magic vars 30583 1726853769.28015: variable 'ansible_distribution_major_version' from source: facts 30583 1726853769.28032: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853769.28043: variable 'omit' from source: magic vars 30583 1726853769.28275: variable 'omit' from source: magic vars 30583 1726853769.28340: variable 'profile' from source: play vars 30583 1726853769.28383: variable 'interface' from source: play vars 30583 1726853769.28477: variable 'interface' from source: play vars 30583 1726853769.28676: variable 'omit' from source: magic vars 30583 1726853769.28695: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853769.28838: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853769.28868: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853769.28953: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853769.28956: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853769.28958: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 30583 1726853769.28961: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853769.28963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853769.29077: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853769.29089: Set connection var ansible_timeout to 10 30583 1726853769.29096: Set connection var ansible_connection to ssh 30583 1726853769.29105: Set connection var ansible_shell_executable to /bin/sh 30583 1726853769.29110: Set connection var ansible_shell_type to sh 30583 1726853769.29124: Set connection var ansible_pipelining to False 30583 1726853769.29153: variable 'ansible_shell_executable' from source: unknown 30583 1726853769.29161: variable 'ansible_connection' from source: unknown 30583 1726853769.29177: variable 'ansible_module_compression' from source: unknown 30583 1726853769.29185: variable 'ansible_shell_type' from source: unknown 30583 1726853769.29192: variable 'ansible_shell_executable' from source: unknown 30583 1726853769.29199: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853769.29206: variable 'ansible_pipelining' from source: unknown 30583 1726853769.29213: variable 'ansible_timeout' from source: unknown 30583 1726853769.29220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853769.29433: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853769.29450: variable 'omit' from source: magic vars 30583 1726853769.29460: starting attempt loop 30583 1726853769.29467: running the handler 30583 1726853769.29573: _low_level_execute_command(): starting 30583 1726853769.29578: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 
1726853769.30347: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853769.30368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853769.30398: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853769.30414: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853769.30680: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853769.30696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853769.32450: stdout chunk (state=3): >>>/root <<< 30583 1726853769.32606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853769.32618: stdout chunk (state=3): >>><<< 30583 1726853769.32628: stderr chunk (state=3): >>><<< 30583 1726853769.32714: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853769.32730: _low_level_execute_command(): starting 30583 1726853769.33062: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853769.3270252-35499-230908833111788 `" && echo ansible-tmp-1726853769.3270252-35499-230908833111788="` echo /root/.ansible/tmp/ansible-tmp-1726853769.3270252-35499-230908833111788 `" ) && sleep 0' 30583 1726853769.34168: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853769.34175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853769.34186: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853769.34188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853769.34339: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853769.34351: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853769.34495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853769.36555: stdout chunk (state=3): >>>ansible-tmp-1726853769.3270252-35499-230908833111788=/root/.ansible/tmp/ansible-tmp-1726853769.3270252-35499-230908833111788 <<< 30583 1726853769.36724: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853769.36770: stderr chunk (state=3): >>><<< 30583 1726853769.36776: stdout chunk (state=3): >>><<< 30583 1726853769.37062: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853769.3270252-35499-230908833111788=/root/.ansible/tmp/ansible-tmp-1726853769.3270252-35499-230908833111788 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853769.37066: variable 'ansible_module_compression' from source: unknown 30583 1726853769.37093: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30583 1726853769.37476: variable 'ansible_facts' from source: unknown 30583 1726853769.37479: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853769.3270252-35499-230908833111788/AnsiballZ_stat.py 30583 1726853769.38094: Sending initial data 30583 1726853769.38097: Sent initial data (153 bytes) 30583 1726853769.38441: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853769.38456: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853769.38523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853769.38538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853769.38603: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853769.38620: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853769.38641: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853769.39076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853769.40735: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853769.40862: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853769.40932: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmprexhr5m7 /root/.ansible/tmp/ansible-tmp-1726853769.3270252-35499-230908833111788/AnsiballZ_stat.py <<< 30583 1726853769.40945: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853769.3270252-35499-230908833111788/AnsiballZ_stat.py" <<< 30583 1726853769.41188: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmprexhr5m7" to remote "/root/.ansible/tmp/ansible-tmp-1726853769.3270252-35499-230908833111788/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853769.3270252-35499-230908833111788/AnsiballZ_stat.py" <<< 30583 1726853769.42039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853769.42106: stderr chunk (state=3): >>><<< 30583 1726853769.42115: stdout chunk (state=3): >>><<< 30583 1726853769.42142: done transferring module to remote 30583 1726853769.42162: _low_level_execute_command(): starting 30583 1726853769.42174: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853769.3270252-35499-230908833111788/ /root/.ansible/tmp/ansible-tmp-1726853769.3270252-35499-230908833111788/AnsiballZ_stat.py && sleep 0' 30583 1726853769.42778: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853769.42883: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853769.42910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853769.43000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853769.44953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853769.44992: stderr chunk (state=3): >>><<< 30583 1726853769.45000: stdout chunk (state=3): >>><<< 30583 1726853769.45020: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853769.45028: _low_level_execute_command(): starting 30583 1726853769.45038: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853769.3270252-35499-230908833111788/AnsiballZ_stat.py && sleep 0' 30583 1726853769.46342: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853769.46363: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853769.46379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853769.46446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853769.46730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853769.46876: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853769.47284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 
1726853769.63064: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30583 1726853769.64478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853769.64503: stderr chunk (state=3): >>><<< 30583 1726853769.64507: stdout chunk (state=3): >>><<< 30583 1726853769.64535: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853769.64561: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853769.3270252-35499-230908833111788/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853769.64568: _low_level_execute_command(): starting 30583 1726853769.64574: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853769.3270252-35499-230908833111788/ > /dev/null 2>&1 && sleep 0' 30583 1726853769.65038: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853769.65042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853769.65044: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853769.65047: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853769.65049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853769.65099: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853769.65106: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853769.65108: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853769.65178: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853769.67176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853769.67180: stderr chunk (state=3): >>><<< 30583 1726853769.67182: stdout chunk (state=3): >>><<< 30583 1726853769.67185: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853769.67187: handler run complete 30583 1726853769.67189: attempt loop complete, returning result 30583 1726853769.67191: _execute() done 30583 1726853769.67193: dumping result to json 30583 1726853769.67195: done dumping result, returning 30583 1726853769.67197: done running TaskExecutor() for managed_node2/TASK: Stat profile file [02083763-bbaf-05ea-abc5-000000001f18] 30583 1726853769.67202: sending task result for task 02083763-bbaf-05ea-abc5-000000001f18 ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 30583 1726853769.67367: no more pending results, returning what we have 30583 1726853769.67374: results queue empty 30583 1726853769.67375: checking for any_errors_fatal 30583 1726853769.67385: done checking for any_errors_fatal 30583 1726853769.67386: checking for max_fail_percentage 30583 1726853769.67388: done checking for max_fail_percentage 30583 1726853769.67389: checking to see if all hosts have failed and the running result is not ok 30583 1726853769.67390: done checking to see if all hosts have failed 30583 1726853769.67391: getting the remaining hosts for this loop 30583 1726853769.67393: done getting the remaining hosts for this loop 30583 1726853769.67398: getting the next task for host managed_node2 30583 1726853769.67407: done getting next task for host managed_node2 30583 1726853769.67410: ^ task is: TASK: Set NM profile exist flag based on the profile files 30583 1726853769.67415: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853769.67420: getting variables 30583 1726853769.67422: in VariableManager get_vars() 30583 1726853769.67470: Calling all_inventory to load vars for managed_node2 30583 1726853769.67556: Calling groups_inventory to load vars for managed_node2 30583 1726853769.67561: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853769.67568: done sending task result for task 02083763-bbaf-05ea-abc5-000000001f18 30583 1726853769.67573: WORKER PROCESS EXITING 30583 1726853769.67585: Calling all_plugins_play to load vars for managed_node2 30583 1726853769.67589: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853769.67593: Calling groups_plugins_play to load vars for managed_node2 30583 1726853769.68633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853769.69492: done with get_vars() 30583 1726853769.69510: done getting variables 30583 1726853769.69554: Loading ActionModule 'set_fact' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:36:09 -0400 (0:00:00.444) 0:01:45.033 ****** 30583 1726853769.69582: entering _queue_task() for managed_node2/set_fact 30583 1726853769.69841: worker is 1 (out of 1 available) 30583 1726853769.69857: exiting _queue_task() for managed_node2/set_fact 30583 1726853769.69869: done queuing things up, now waiting for results queue to drain 30583 1726853769.69872: waiting for pending results... 30583 1726853769.70056: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 30583 1726853769.70149: in run() - task 02083763-bbaf-05ea-abc5-000000001f19 30583 1726853769.70160: variable 'ansible_search_path' from source: unknown 30583 1726853769.70164: variable 'ansible_search_path' from source: unknown 30583 1726853769.70196: calling self._execute() 30583 1726853769.70273: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853769.70276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853769.70285: variable 'omit' from source: magic vars 30583 1726853769.70565: variable 'ansible_distribution_major_version' from source: facts 30583 1726853769.70576: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853769.70664: variable 'profile_stat' from source: set_fact 30583 1726853769.70675: Evaluated conditional (profile_stat.stat.exists): False 30583 1726853769.70679: when evaluation is False, skipping this task 30583 1726853769.70681: _execute() 
done 30583 1726853769.70684: dumping result to json 30583 1726853769.70686: done dumping result, returning 30583 1726853769.70694: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [02083763-bbaf-05ea-abc5-000000001f19] 30583 1726853769.70698: sending task result for task 02083763-bbaf-05ea-abc5-000000001f19 30583 1726853769.70780: done sending task result for task 02083763-bbaf-05ea-abc5-000000001f19 30583 1726853769.70783: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30583 1726853769.70827: no more pending results, returning what we have 30583 1726853769.70832: results queue empty 30583 1726853769.70833: checking for any_errors_fatal 30583 1726853769.70843: done checking for any_errors_fatal 30583 1726853769.70843: checking for max_fail_percentage 30583 1726853769.70845: done checking for max_fail_percentage 30583 1726853769.70846: checking to see if all hosts have failed and the running result is not ok 30583 1726853769.70847: done checking to see if all hosts have failed 30583 1726853769.70847: getting the remaining hosts for this loop 30583 1726853769.70850: done getting the remaining hosts for this loop 30583 1726853769.70854: getting the next task for host managed_node2 30583 1726853769.70862: done getting next task for host managed_node2 30583 1726853769.70865: ^ task is: TASK: Get NM profile info 30583 1726853769.70873: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853769.70879: getting variables 30583 1726853769.70880: in VariableManager get_vars() 30583 1726853769.70920: Calling all_inventory to load vars for managed_node2 30583 1726853769.70922: Calling groups_inventory to load vars for managed_node2 30583 1726853769.70926: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853769.70936: Calling all_plugins_play to load vars for managed_node2 30583 1726853769.70939: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853769.70941: Calling groups_plugins_play to load vars for managed_node2 30583 1726853769.71750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853769.72672: done with get_vars() 30583 1726853769.72691: done getting variables 30583 1726853769.72735: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task 
path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:36:09 -0400 (0:00:00.031) 0:01:45.064 ****** 30583 1726853769.72761: entering _queue_task() for managed_node2/shell 30583 1726853769.73017: worker is 1 (out of 1 available) 30583 1726853769.73031: exiting _queue_task() for managed_node2/shell 30583 1726853769.73044: done queuing things up, now waiting for results queue to drain 30583 1726853769.73045: waiting for pending results... 30583 1726853769.73234: running TaskExecutor() for managed_node2/TASK: Get NM profile info 30583 1726853769.73314: in run() - task 02083763-bbaf-05ea-abc5-000000001f1a 30583 1726853769.73326: variable 'ansible_search_path' from source: unknown 30583 1726853769.73329: variable 'ansible_search_path' from source: unknown 30583 1726853769.73357: calling self._execute() 30583 1726853769.73438: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853769.73442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853769.73451: variable 'omit' from source: magic vars 30583 1726853769.73891: variable 'ansible_distribution_major_version' from source: facts 30583 1726853769.73894: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853769.73897: variable 'omit' from source: magic vars 30583 1726853769.73900: variable 'omit' from source: magic vars 30583 1726853769.73945: variable 'profile' from source: play vars 30583 1726853769.73949: variable 'interface' from source: play vars 30583 1726853769.74007: variable 'interface' from source: play vars 30583 1726853769.74025: variable 'omit' from source: magic vars 30583 1726853769.74064: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853769.74104: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, 
class_only=False) 30583 1726853769.74117: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853769.74133: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853769.74144: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853769.74173: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853769.74176: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853769.74179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853769.74268: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853769.74275: Set connection var ansible_timeout to 10 30583 1726853769.74277: Set connection var ansible_connection to ssh 30583 1726853769.74283: Set connection var ansible_shell_executable to /bin/sh 30583 1726853769.74286: Set connection var ansible_shell_type to sh 30583 1726853769.74295: Set connection var ansible_pipelining to False 30583 1726853769.74327: variable 'ansible_shell_executable' from source: unknown 30583 1726853769.74330: variable 'ansible_connection' from source: unknown 30583 1726853769.74333: variable 'ansible_module_compression' from source: unknown 30583 1726853769.74335: variable 'ansible_shell_type' from source: unknown 30583 1726853769.74337: variable 'ansible_shell_executable' from source: unknown 30583 1726853769.74339: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853769.74341: variable 'ansible_pipelining' from source: unknown 30583 1726853769.74343: variable 'ansible_timeout' from source: unknown 30583 1726853769.74345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853769.74545: Loading ActionModule 'shell' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853769.74549: variable 'omit' from source: magic vars 30583 1726853769.74551: starting attempt loop 30583 1726853769.74554: running the handler 30583 1726853769.74556: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853769.74561: _low_level_execute_command(): starting 30583 1726853769.74564: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853769.75166: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853769.75180: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853769.75198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853769.75205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853769.75217: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853769.75223: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853769.75232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853769.75283: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853769.75286: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853769.75289: stderr chunk (state=3): >>>debug1: re-parsing 
configuration <<< 30583 1726853769.75291: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853769.75294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853769.75296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853769.75309: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853769.75312: stderr chunk (state=3): >>>debug2: match found <<< 30583 1726853769.75315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853769.75368: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853769.75390: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853769.75392: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853769.75495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853769.77228: stdout chunk (state=3): >>>/root <<< 30583 1726853769.77326: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853769.77361: stderr chunk (state=3): >>><<< 30583 1726853769.77364: stdout chunk (state=3): >>><<< 30583 1726853769.77384: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853769.77395: _low_level_execute_command(): starting 30583 1726853769.77402: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853769.7738335-35528-151593587586050 `" && echo ansible-tmp-1726853769.7738335-35528-151593587586050="` echo /root/.ansible/tmp/ansible-tmp-1726853769.7738335-35528-151593587586050 `" ) && sleep 0' 30583 1726853769.77841: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853769.77845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853769.77847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853769.77849: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 
1726853769.77852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853769.77910: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853769.77912: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853769.77980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853769.79980: stdout chunk (state=3): >>>ansible-tmp-1726853769.7738335-35528-151593587586050=/root/.ansible/tmp/ansible-tmp-1726853769.7738335-35528-151593587586050 <<< 30583 1726853769.80081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853769.80113: stderr chunk (state=3): >>><<< 30583 1726853769.80117: stdout chunk (state=3): >>><<< 30583 1726853769.80135: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853769.7738335-35528-151593587586050=/root/.ansible/tmp/ansible-tmp-1726853769.7738335-35528-151593587586050 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853769.80165: variable 'ansible_module_compression' from source: unknown 30583 1726853769.80210: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30583 1726853769.80243: variable 'ansible_facts' from source: unknown 30583 1726853769.80303: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853769.7738335-35528-151593587586050/AnsiballZ_command.py 30583 1726853769.80408: Sending initial data 30583 1726853769.80412: Sent initial data (156 bytes) 30583 1726853769.80841: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853769.80849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853769.80877: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853769.80880: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853769.80882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853769.80886: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853769.80928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853769.80942: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853769.81022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853769.82653: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30583 1726853769.82660: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853769.82722: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853769.82789: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmppbqwaehk /root/.ansible/tmp/ansible-tmp-1726853769.7738335-35528-151593587586050/AnsiballZ_command.py <<< 30583 1726853769.82795: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853769.7738335-35528-151593587586050/AnsiballZ_command.py" <<< 30583 1726853769.82863: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmppbqwaehk" to remote "/root/.ansible/tmp/ansible-tmp-1726853769.7738335-35528-151593587586050/AnsiballZ_command.py" <<< 30583 1726853769.82866: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853769.7738335-35528-151593587586050/AnsiballZ_command.py" <<< 30583 1726853769.83541: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853769.83588: stderr chunk (state=3): >>><<< 30583 1726853769.83595: stdout chunk (state=3): >>><<< 30583 1726853769.83637: done transferring module to remote 30583 1726853769.83646: _low_level_execute_command(): starting 30583 1726853769.83653: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853769.7738335-35528-151593587586050/ /root/.ansible/tmp/ansible-tmp-1726853769.7738335-35528-151593587586050/AnsiballZ_command.py && sleep 0' 30583 1726853769.84261: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853769.84265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 
1726853769.84267: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853769.84269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853769.84276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853769.84327: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853769.84386: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853769.86213: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853769.86239: stderr chunk (state=3): >>><<< 30583 1726853769.86242: stdout chunk (state=3): >>><<< 30583 1726853769.86257: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30583 1726853769.86261: _low_level_execute_command(): starting
30583 1726853769.86267: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853769.7738335-35528-151593587586050/AnsiballZ_command.py && sleep 0'
30583 1726853769.86746: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
30583 1726853769.86750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<<
30583 1726853769.86752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<<
30583 1726853769.86756: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853769.86774: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<<
30583 1726853769.86799: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30583 1726853769.86851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30583 1726853769.86938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30583 1726853770.04277: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 13:36:10.025086", "end": "2024-09-20 13:36:10.041553", "delta": "0:00:00.016467", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<<
30583 1726853770.05843: stderr chunk (state=3): >>>debug2: Received exit status from master 1 <<<
30583 1726853770.05867: stderr chunk (state=3): >>>Shared connection to 10.31.9.197 closed. <<<
30583 1726853770.05939: stderr chunk (state=3): >>><<<
30583 1726853770.05942: stdout chunk (state=3): >>><<<
30583 1726853770.06022: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 13:36:10.025086", "end": "2024-09-20 13:36:10.041553", "delta": "0:00:00.016467", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.197 closed.
30583 1726853770.06027: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853769.7738335-35528-151593587586050/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
30583 1726853770.06034: _low_level_execute_command(): starting
30583 1726853770.06037: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853769.7738335-35528-151593587586050/ > /dev/null 2>&1 && sleep 0'
30583 1726853770.06633: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
30583 1726853770.06643: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
30583 1726853770.06680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
30583 1726853770.06684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30583 1726853770.06689: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<<
30583 1726853770.06705: stderr chunk (state=3): >>>debug2: match not found <<<
30583 1726853770.06833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<<
30583 1726853770.07010: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
30583 1726853770.07073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30583 1726853770.09277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30583 1726853770.09281: stdout chunk (state=3): >>><<<
30583 1726853770.09283: stderr chunk (state=3): >>><<<
30583 1726853770.09285: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30583 1726853770.09287: handler run complete
30583 1726853770.09288: Evaluated conditional (False): False
30583 1726853770.09290: attempt loop complete, returning result
30583 1726853770.09291: _execute() done
30583 1726853770.09293: dumping result to json
30583 1726853770.09294: done dumping result, returning
30583 1726853770.09296: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [02083763-bbaf-05ea-abc5-000000001f1a]
30583 1726853770.09297: sending task result for task 02083763-bbaf-05ea-abc5-000000001f1a
30583 1726853770.09366: done sending task result for task 02083763-bbaf-05ea-abc5-000000001f1a
30583 1726853770.09369: WORKER PROCESS EXITING
fatal: [managed_node2]: FAILED! => {
    "changed": false,
    "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc",
    "delta": "0:00:00.016467",
    "end": "2024-09-20 13:36:10.041553",
    "rc": 1,
    "start": "2024-09-20 13:36:10.025086"
}

MSG:

non-zero return code
...ignoring
30583 1726853770.09451: no more pending results, returning what we have
30583 1726853770.09455: results queue empty
30583 1726853770.09456: checking for any_errors_fatal
30583 1726853770.09463: done checking for any_errors_fatal
30583 1726853770.09464: checking for max_fail_percentage
30583 1726853770.09466: done checking for max_fail_percentage
30583 1726853770.09467: checking to see if all hosts have failed and the running result is not ok
30583 1726853770.09468: done checking to see if all hosts have failed
30583 1726853770.09469: getting the remaining hosts for this loop
30583 1726853770.09476: done getting the remaining hosts for this loop
30583 1726853770.09484: getting the next task for host managed_node2
30583 1726853770.09494: done getting next task for host managed_node2
30583 1726853770.09497: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
30583 1726853770.09502: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853770.09508: getting variables
30583 1726853770.09510: in VariableManager get_vars()
30583 1726853770.09555: Calling all_inventory to load vars for managed_node2
30583 1726853770.09558: Calling groups_inventory to load vars for managed_node2
30583 1726853770.09562: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853770.09679: Calling all_plugins_play to load vars for managed_node2
30583 1726853770.09684: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853770.09688: Calling groups_plugins_play to load vars for managed_node2
30583 1726853770.11432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853770.13854: done with get_vars()
30583 1726853770.14006: done getting variables
30583 1726853770.14070: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35
Friday 20 September 2024 13:36:10 -0400 (0:00:00.414) 0:01:45.479 ******
30583 1726853770.14214: entering _queue_task() for managed_node2/set_fact
30583 1726853770.14962: worker is 1 (out of 1 available)
30583 1726853770.14976: exiting _queue_task() for managed_node2/set_fact
30583 1726853770.15076: done queuing things up, now waiting for results queue to drain
30583 1726853770.15078: waiting for pending results...
30583 1726853770.15490: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
30583 1726853770.15618: in run() - task 02083763-bbaf-05ea-abc5-000000001f1b
30583 1726853770.15623: variable 'ansible_search_path' from source: unknown
30583 1726853770.15626: variable 'ansible_search_path' from source: unknown
30583 1726853770.15650: calling self._execute()
30583 1726853770.15754: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853770.15770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853770.15789: variable 'omit' from source: magic vars
30583 1726853770.16274: variable 'ansible_distribution_major_version' from source: facts
30583 1726853770.16279: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853770.16346: variable 'nm_profile_exists' from source: set_fact
30583 1726853770.16367: Evaluated conditional (nm_profile_exists.rc == 0): False
30583 1726853770.16380: when evaluation is False, skipping this task
30583 1726853770.16388: _execute() done
30583 1726853770.16396: dumping result to json
30583 1726853770.16403: done dumping result, returning
30583 1726853770.16443: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-05ea-abc5-000000001f1b]
30583 1726853770.16453: sending task result for task 02083763-bbaf-05ea-abc5-000000001f1b
30583 1726853770.16728: done sending task result for task 02083763-bbaf-05ea-abc5-000000001f1b
30583 1726853770.16732: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "nm_profile_exists.rc == 0",
    "skip_reason": "Conditional result was False"
}
30583 1726853770.16784: no more pending results, returning what we have
30583 1726853770.16789: results queue empty
30583 1726853770.16790: checking for any_errors_fatal
30583 1726853770.16801: done checking for any_errors_fatal
30583 1726853770.16801: checking for max_fail_percentage
30583 1726853770.16803: done checking for max_fail_percentage
30583 1726853770.16808: checking to see if all hosts have failed and the running result is not ok
30583 1726853770.16809: done checking to see if all hosts have failed
30583 1726853770.16810: getting the remaining hosts for this loop
30583 1726853770.16812: done getting the remaining hosts for this loop
30583 1726853770.16815: getting the next task for host managed_node2
30583 1726853770.16826: done getting next task for host managed_node2
30583 1726853770.16829: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }}
30583 1726853770.16834: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853770.16839: getting variables
30583 1726853770.16841: in VariableManager get_vars()
30583 1726853770.16923: Calling all_inventory to load vars for managed_node2
30583 1726853770.16926: Calling groups_inventory to load vars for managed_node2
30583 1726853770.16930: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853770.16940: Calling all_plugins_play to load vars for managed_node2
30583 1726853770.16943: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853770.16945: Calling groups_plugins_play to load vars for managed_node2
30583 1726853770.18351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853770.21091: done with get_vars()
30583 1726853770.21116: done getting variables
30583 1726853770.21187: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30583 1726853770.21310: variable 'profile' from source: play vars
30583 1726853770.21314: variable 'interface' from source: play vars
30583 1726853770.21378: variable 'interface' from source: play vars

TASK [Get the ansible_managed comment in ifcfg-statebr] ************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49
Friday 20 September 2024 13:36:10 -0400 (0:00:00.071) 0:01:45.551 ******
30583 1726853770.21410: entering _queue_task() for managed_node2/command
30583 1726853770.21803: worker is 1 (out of 1 available)
30583 1726853770.21816: exiting _queue_task() for managed_node2/command
30583 1726853770.21827: done queuing things up, now waiting for results queue to drain
30583 1726853770.21829: waiting for pending results...
30583 1726853770.22092: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr
30583 1726853770.22200: in run() - task 02083763-bbaf-05ea-abc5-000000001f1d
30583 1726853770.22376: variable 'ansible_search_path' from source: unknown
30583 1726853770.22380: variable 'ansible_search_path' from source: unknown
30583 1726853770.22385: calling self._execute()
30583 1726853770.22391: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853770.22393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853770.22422: variable 'omit' from source: magic vars
30583 1726853770.22877: variable 'ansible_distribution_major_version' from source: facts
30583 1726853770.22895: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853770.23029: variable 'profile_stat' from source: set_fact
30583 1726853770.23047: Evaluated conditional (profile_stat.stat.exists): False
30583 1726853770.23054: when evaluation is False, skipping this task
30583 1726853770.23064: _execute() done
30583 1726853770.23078: dumping result to json
30583 1726853770.23085: done dumping result, returning
30583 1726853770.23102: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr [02083763-bbaf-05ea-abc5-000000001f1d]
30583 1726853770.23111: sending task result for task 02083763-bbaf-05ea-abc5-000000001f1d
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
30583 1726853770.23377: no more pending results, returning what we have
30583 1726853770.23381: results queue empty
30583 1726853770.23382: checking for any_errors_fatal
30583 1726853770.23390: done checking for any_errors_fatal
30583 1726853770.23391: checking for max_fail_percentage
30583 1726853770.23394: done checking for max_fail_percentage
30583 1726853770.23396: checking to see if all hosts have failed and the running result is not ok
30583 1726853770.23396: done checking to see if all hosts have failed
30583 1726853770.23397: getting the remaining hosts for this loop
30583 1726853770.23399: done getting the remaining hosts for this loop
30583 1726853770.23404: getting the next task for host managed_node2
30583 1726853770.23413: done getting next task for host managed_node2
30583 1726853770.23416: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }}
30583 1726853770.23422: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853770.23430: getting variables
30583 1726853770.23431: in VariableManager get_vars()
30583 1726853770.23595: Calling all_inventory to load vars for managed_node2
30583 1726853770.23598: Calling groups_inventory to load vars for managed_node2
30583 1726853770.23602: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853770.23615: Calling all_plugins_play to load vars for managed_node2
30583 1726853770.23618: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853770.23621: Calling groups_plugins_play to load vars for managed_node2
30583 1726853770.24287: done sending task result for task 02083763-bbaf-05ea-abc5-000000001f1d
30583 1726853770.24291: WORKER PROCESS EXITING
30583 1726853770.25239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853770.26902: done with get_vars()
30583 1726853770.26933: done getting variables
30583 1726853770.27004: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30583 1726853770.27122: variable 'profile' from source: play vars
30583 1726853770.27127: variable 'interface' from source: play vars
30583 1726853770.27192: variable 'interface' from source: play vars

TASK [Verify the ansible_managed comment in ifcfg-statebr] *********************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56
Friday 20 September 2024 13:36:10 -0400 (0:00:00.058) 0:01:45.609 ******
30583 1726853770.27225: entering _queue_task() for managed_node2/set_fact
30583 1726853770.27787: worker is 1 (out of 1 available)
30583 1726853770.27798: exiting _queue_task() for managed_node2/set_fact
30583 1726853770.27808: done queuing things up, now waiting for results queue to drain
30583 1726853770.27809: waiting for pending results...
30583 1726853770.27967: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr
30583 1726853770.28116: in run() - task 02083763-bbaf-05ea-abc5-000000001f1e
30583 1726853770.28140: variable 'ansible_search_path' from source: unknown
30583 1726853770.28261: variable 'ansible_search_path' from source: unknown
30583 1726853770.28265: calling self._execute()
30583 1726853770.28309: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853770.28321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853770.28336: variable 'omit' from source: magic vars
30583 1726853770.28728: variable 'ansible_distribution_major_version' from source: facts
30583 1726853770.28746: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853770.28882: variable 'profile_stat' from source: set_fact
30583 1726853770.28900: Evaluated conditional (profile_stat.stat.exists): False
30583 1726853770.28915: when evaluation is False, skipping this task
30583 1726853770.28924: _execute() done
30583 1726853770.28932: dumping result to json
30583 1726853770.28940: done dumping result, returning
30583 1726853770.28952: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [02083763-bbaf-05ea-abc5-000000001f1e]
30583 1726853770.28965: sending task result for task 02083763-bbaf-05ea-abc5-000000001f1e
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
30583 1726853770.29120: no more pending results, returning what we have
30583 1726853770.29236: results queue empty
30583 1726853770.29238: checking for any_errors_fatal
30583 1726853770.29249: done checking for any_errors_fatal
30583 1726853770.29250: checking for max_fail_percentage
30583 1726853770.29253: done checking for max_fail_percentage
30583 1726853770.29254: checking to see if all hosts have failed and the running result is not ok
30583 1726853770.29255: done checking to see if all hosts have failed
30583 1726853770.29256: getting the remaining hosts for this loop
30583 1726853770.29260: done getting the remaining hosts for this loop
30583 1726853770.29265: getting the next task for host managed_node2
30583 1726853770.29277: done getting next task for host managed_node2
30583 1726853770.29281: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }}
30583 1726853770.29286: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853770.29291: getting variables
30583 1726853770.29294: in VariableManager get_vars()
30583 1726853770.29339: Calling all_inventory to load vars for managed_node2
30583 1726853770.29342: Calling groups_inventory to load vars for managed_node2
30583 1726853770.29347: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853770.29363: Calling all_plugins_play to load vars for managed_node2
30583 1726853770.29368: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853770.29378: done sending task result for task 02083763-bbaf-05ea-abc5-000000001f1e
30583 1726853770.29382: WORKER PROCESS EXITING
30583 1726853770.29489: Calling groups_plugins_play to load vars for managed_node2
30583 1726853770.31322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853770.32917: done with get_vars()
30583 1726853770.32950: done getting variables
30583 1726853770.33014: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30583 1726853770.33125: variable 'profile' from source: play vars
30583 1726853770.33130: variable 'interface' from source: play vars
30583 1726853770.33197: variable 'interface' from source: play vars

TASK [Get the fingerprint comment in ifcfg-statebr] ****************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62
Friday 20 September 2024 13:36:10 -0400 (0:00:00.060) 0:01:45.669 ******
30583 1726853770.33232: entering _queue_task() for managed_node2/command
30583 1726853770.33793: worker is 1 (out of 1 available)
30583 1726853770.33805: exiting _queue_task() for managed_node2/command
30583 1726853770.33814: done queuing things up, now waiting for results queue to drain
30583 1726853770.33816: waiting for pending results...
30583 1726853770.33976: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr
30583 1726853770.34116: in run() - task 02083763-bbaf-05ea-abc5-000000001f1f
30583 1726853770.34137: variable 'ansible_search_path' from source: unknown
30583 1726853770.34150: variable 'ansible_search_path' from source: unknown
30583 1726853770.34196: calling self._execute()
30583 1726853770.34315: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853770.34325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853770.34370: variable 'omit' from source: magic vars
30583 1726853770.34736: variable 'ansible_distribution_major_version' from source: facts
30583 1726853770.34753: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853770.34893: variable 'profile_stat' from source: set_fact
30583 1726853770.34919: Evaluated conditional (profile_stat.stat.exists): False
30583 1726853770.35020: when evaluation is False, skipping this task
30583 1726853770.35025: _execute() done
30583 1726853770.35027: dumping result to json
30583 1726853770.35030: done dumping result, returning
30583 1726853770.35032: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr [02083763-bbaf-05ea-abc5-000000001f1f]
30583 1726853770.35036: sending task result for task 02083763-bbaf-05ea-abc5-000000001f1f
30583 1726853770.35107: done sending task result for task 02083763-bbaf-05ea-abc5-000000001f1f
30583 1726853770.35110: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
30583 1726853770.35166: no more pending results, returning what we have
30583 1726853770.35172: results queue empty
30583 1726853770.35173: checking for any_errors_fatal
30583 1726853770.35181: done checking for any_errors_fatal
30583 1726853770.35182: checking for max_fail_percentage
30583 1726853770.35184: done checking for max_fail_percentage
30583 1726853770.35185: checking to see if all hosts have failed and the running result is not ok
30583 1726853770.35186: done checking to see if all hosts have failed
30583 1726853770.35187: getting the remaining hosts for this loop
30583 1726853770.35189: done getting the remaining hosts for this loop
30583 1726853770.35192: getting the next task for host managed_node2
30583 1726853770.35203: done getting next task for host managed_node2
30583 1726853770.35206: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }}
30583 1726853770.35212: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853770.35217: getting variables
30583 1726853770.35219: in VariableManager get_vars()
30583 1726853770.35270: Calling all_inventory to load vars for managed_node2
30583 1726853770.35376: Calling groups_inventory to load vars for managed_node2
30583 1726853770.35384: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853770.35398: Calling all_plugins_play to load vars for managed_node2
30583 1726853770.35402: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853770.35405: Calling groups_plugins_play to load vars for managed_node2
30583 1726853770.37054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853770.38699: done with get_vars()
30583 1726853770.38729: done getting variables
30583 1726853770.38800: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30583 1726853770.38916: variable 'profile' from source: play vars
30583 1726853770.38920: variable 'interface' from source: play vars
30583 1726853770.38985: variable 'interface' from source: play vars

TASK [Verify the fingerprint comment in ifcfg-statebr] *************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69
Friday 20 September 2024 13:36:10 -0400 (0:00:00.057) 0:01:45.727 ******
30583 1726853770.39020: entering _queue_task() for managed_node2/set_fact
30583 1726853770.39477: worker is 1 (out of 1 available)
30583 1726853770.39489: exiting _queue_task() for managed_node2/set_fact
30583 1726853770.39500: done queuing things up, now waiting for results queue to drain
30583 1726853770.39501: waiting for pending results...
30583 1726853770.39861: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr 30583 1726853770.39961: in run() - task 02083763-bbaf-05ea-abc5-000000001f20 30583 1726853770.39965: variable 'ansible_search_path' from source: unknown 30583 1726853770.39969: variable 'ansible_search_path' from source: unknown 30583 1726853770.39983: calling self._execute() 30583 1726853770.40097: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853770.40110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853770.40128: variable 'omit' from source: magic vars 30583 1726853770.40553: variable 'ansible_distribution_major_version' from source: facts 30583 1726853770.40556: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853770.40695: variable 'profile_stat' from source: set_fact 30583 1726853770.40719: Evaluated conditional (profile_stat.stat.exists): False 30583 1726853770.40722: when evaluation is False, skipping this task 30583 1726853770.40770: _execute() done 30583 1726853770.40775: dumping result to json 30583 1726853770.40777: done dumping result, returning 30583 1726853770.40780: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr [02083763-bbaf-05ea-abc5-000000001f20] 30583 1726853770.40782: sending task result for task 02083763-bbaf-05ea-abc5-000000001f20 30583 1726853770.40973: done sending task result for task 02083763-bbaf-05ea-abc5-000000001f20 30583 1726853770.40977: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30583 1726853770.41026: no more pending results, returning what we have 30583 1726853770.41030: results queue empty 30583 1726853770.41032: checking for any_errors_fatal 30583 1726853770.41047: done checking for any_errors_fatal 30583 1726853770.41048: checking 
for max_fail_percentage 30583 1726853770.41051: done checking for max_fail_percentage 30583 1726853770.41052: checking to see if all hosts have failed and the running result is not ok 30583 1726853770.41053: done checking to see if all hosts have failed 30583 1726853770.41053: getting the remaining hosts for this loop 30583 1726853770.41056: done getting the remaining hosts for this loop 30583 1726853770.41062: getting the next task for host managed_node2 30583 1726853770.41076: done getting next task for host managed_node2 30583 1726853770.41080: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 30583 1726853770.41084: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853770.41090: getting variables 30583 1726853770.41092: in VariableManager get_vars() 30583 1726853770.41137: Calling all_inventory to load vars for managed_node2 30583 1726853770.41140: Calling groups_inventory to load vars for managed_node2 30583 1726853770.41144: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853770.41284: Calling all_plugins_play to load vars for managed_node2 30583 1726853770.41288: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853770.41291: Calling groups_plugins_play to load vars for managed_node2 30583 1726853770.49525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853770.52488: done with get_vars() 30583 1726853770.52522: done getting variables 30583 1726853770.52579: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853770.52713: variable 'profile' from source: play vars 30583 1726853770.52716: variable 'interface' from source: play vars 30583 1726853770.52786: variable 'interface' from source: play vars TASK [Assert that the profile is absent - 'statebr'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 13:36:10 -0400 (0:00:00.137) 0:01:45.865 ****** 30583 1726853770.52820: entering _queue_task() for managed_node2/assert 30583 1726853770.53230: worker is 1 (out of 1 available) 30583 1726853770.53243: exiting _queue_task() for managed_node2/assert 30583 1726853770.53370: done queuing things up, now waiting for results queue to drain 30583 1726853770.53373: waiting for pending results... 
30583 1726853770.53590: running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'statebr' 30583 1726853770.53737: in run() - task 02083763-bbaf-05ea-abc5-000000001e9a 30583 1726853770.53756: variable 'ansible_search_path' from source: unknown 30583 1726853770.53768: variable 'ansible_search_path' from source: unknown 30583 1726853770.53817: calling self._execute() 30583 1726853770.53931: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853770.53944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853770.53961: variable 'omit' from source: magic vars 30583 1726853770.54388: variable 'ansible_distribution_major_version' from source: facts 30583 1726853770.54407: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853770.54420: variable 'omit' from source: magic vars 30583 1726853770.54555: variable 'omit' from source: magic vars 30583 1726853770.54598: variable 'profile' from source: play vars 30583 1726853770.54602: variable 'interface' from source: play vars 30583 1726853770.54725: variable 'interface' from source: play vars 30583 1726853770.54781: variable 'omit' from source: magic vars 30583 1726853770.54865: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853770.54907: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853770.54929: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853770.54980: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853770.54991: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853770.55077: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 30583 1726853770.55080: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853770.55082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853770.55326: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853770.55330: Set connection var ansible_timeout to 10 30583 1726853770.55333: Set connection var ansible_connection to ssh 30583 1726853770.55335: Set connection var ansible_shell_executable to /bin/sh 30583 1726853770.55338: Set connection var ansible_shell_type to sh 30583 1726853770.55340: Set connection var ansible_pipelining to False 30583 1726853770.55342: variable 'ansible_shell_executable' from source: unknown 30583 1726853770.55344: variable 'ansible_connection' from source: unknown 30583 1726853770.55346: variable 'ansible_module_compression' from source: unknown 30583 1726853770.55348: variable 'ansible_shell_type' from source: unknown 30583 1726853770.55350: variable 'ansible_shell_executable' from source: unknown 30583 1726853770.55352: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853770.55354: variable 'ansible_pipelining' from source: unknown 30583 1726853770.55357: variable 'ansible_timeout' from source: unknown 30583 1726853770.55361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853770.55478: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853770.55482: variable 'omit' from source: magic vars 30583 1726853770.55485: starting attempt loop 30583 1726853770.55487: running the handler 30583 1726853770.55604: variable 'lsr_net_profile_exists' from source: set_fact 30583 1726853770.55607: Evaluated conditional (not 
lsr_net_profile_exists): True 30583 1726853770.55609: handler run complete 30583 1726853770.55611: attempt loop complete, returning result 30583 1726853770.55614: _execute() done 30583 1726853770.55617: dumping result to json 30583 1726853770.55619: done dumping result, returning 30583 1726853770.55621: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'statebr' [02083763-bbaf-05ea-abc5-000000001e9a] 30583 1726853770.55623: sending task result for task 02083763-bbaf-05ea-abc5-000000001e9a ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30583 1726853770.55847: no more pending results, returning what we have 30583 1726853770.55850: results queue empty 30583 1726853770.55851: checking for any_errors_fatal 30583 1726853770.55865: done checking for any_errors_fatal 30583 1726853770.55866: checking for max_fail_percentage 30583 1726853770.55868: done checking for max_fail_percentage 30583 1726853770.55869: checking to see if all hosts have failed and the running result is not ok 30583 1726853770.55870: done checking to see if all hosts have failed 30583 1726853770.55872: getting the remaining hosts for this loop 30583 1726853770.55874: done getting the remaining hosts for this loop 30583 1726853770.55877: getting the next task for host managed_node2 30583 1726853770.55888: done getting next task for host managed_node2 30583 1726853770.55892: ^ task is: TASK: Conditional asserts 30583 1726853770.55894: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30583 1726853770.55900: getting variables 30583 1726853770.55901: in VariableManager get_vars() 30583 1726853770.56098: Calling all_inventory to load vars for managed_node2 30583 1726853770.56102: Calling groups_inventory to load vars for managed_node2 30583 1726853770.56105: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853770.56113: Calling all_plugins_play to load vars for managed_node2 30583 1726853770.56115: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853770.56118: Calling groups_plugins_play to load vars for managed_node2 30583 1726853770.56642: done sending task result for task 02083763-bbaf-05ea-abc5-000000001e9a 30583 1726853770.56646: WORKER PROCESS EXITING 30583 1726853770.57419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853770.58941: done with get_vars() 30583 1726853770.58965: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 13:36:10 -0400 (0:00:00.062) 0:01:45.927 ****** 30583 1726853770.59070: entering _queue_task() for managed_node2/include_tasks 30583 1726853770.59601: worker is 1 (out of 1 available) 30583 1726853770.59613: exiting _queue_task() for managed_node2/include_tasks 30583 1726853770.59626: done queuing things up, now waiting for results queue to drain 30583 1726853770.59628: waiting for pending results... 
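Both fingerprint tasks above are skipped because `profile_stat.stat.exists` evaluated to False, and the absence assert then passes on `not lsr_net_profile_exists`. A hedged sketch of this gate-then-assert pattern follows; the module choices are illustrative reconstructions from the conditionals reported in the log, not the actual contents of get_profile_stat.yml / assert_profile_absent.yml:

```yaml
# Illustrative only -- reconstructed from the conditionals the log reports,
# not the real task files under tests/network/playbooks/tasks/.
- name: Verify the fingerprint comment in ifcfg-{{ profile }}
  set_fact:                        # the log shows the 'set_fact' action loading for this task
    lsr_net_profile_fingerprint: true
  when: profile_stat.stat.exists   # evaluated False here, so Ansible skips the task

- name: Assert that the profile is absent - '{{ profile }}'
  assert:
    that:
      - not lsr_net_profile_exists # the log records this conditional evaluating True
```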
30583 1726853770.59917: running TaskExecutor() for managed_node2/TASK: Conditional asserts 30583 1726853770.60098: in run() - task 02083763-bbaf-05ea-abc5-00000000174a 30583 1726853770.60103: variable 'ansible_search_path' from source: unknown 30583 1726853770.60105: variable 'ansible_search_path' from source: unknown 30583 1726853770.60403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853770.62744: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853770.62796: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853770.62825: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853770.62849: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853770.62874: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853770.62935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853770.62956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853770.62980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853770.63006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30583 1726853770.63017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853770.63104: variable 'lsr_assert_when' from source: include params 30583 1726853770.63185: variable 'network_provider' from source: set_fact 30583 1726853770.63238: variable 'omit' from source: magic vars 30583 1726853770.63329: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853770.63338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853770.63346: variable 'omit' from source: magic vars 30583 1726853770.63482: variable 'ansible_distribution_major_version' from source: facts 30583 1726853770.63489: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853770.63567: variable 'item' from source: unknown 30583 1726853770.63574: Evaluated conditional (item['condition']): True 30583 1726853770.63630: variable 'item' from source: unknown 30583 1726853770.63653: variable 'item' from source: unknown 30583 1726853770.63702: variable 'item' from source: unknown 30583 1726853770.63834: dumping result to json 30583 1726853770.63838: done dumping result, returning 30583 1726853770.63840: done running TaskExecutor() for managed_node2/TASK: Conditional asserts [02083763-bbaf-05ea-abc5-00000000174a] 30583 1726853770.63842: sending task result for task 02083763-bbaf-05ea-abc5-00000000174a 30583 1726853770.63881: done sending task result for task 02083763-bbaf-05ea-abc5-00000000174a 30583 1726853770.63883: WORKER PROCESS EXITING 30583 1726853770.63904: no more pending results, returning what we have 30583 1726853770.63909: in VariableManager get_vars() 30583 1726853770.63962: Calling all_inventory to load vars for managed_node2 30583 1726853770.63965: Calling groups_inventory to load vars for managed_node2 30583 1726853770.63968: 
Calling all_plugins_inventory to load vars for managed_node2 30583 1726853770.63980: Calling all_plugins_play to load vars for managed_node2 30583 1726853770.63983: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853770.63986: Calling groups_plugins_play to load vars for managed_node2 30583 1726853770.64954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853770.65817: done with get_vars() 30583 1726853770.65830: variable 'ansible_search_path' from source: unknown 30583 1726853770.65831: variable 'ansible_search_path' from source: unknown 30583 1726853770.65859: we have included files to process 30583 1726853770.65860: generating all_blocks data 30583 1726853770.65862: done generating all_blocks data 30583 1726853770.65865: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30583 1726853770.65866: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30583 1726853770.65867: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30583 1726853770.65942: in VariableManager get_vars() 30583 1726853770.65956: done with get_vars() 30583 1726853770.66035: done processing included file 30583 1726853770.66037: iterating over new_blocks loaded from include file 30583 1726853770.66038: in VariableManager get_vars() 30583 1726853770.66049: done with get_vars() 30583 1726853770.66050: filtering new block on tags 30583 1726853770.66073: done filtering new block on tags 30583 1726853770.66075: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 => (item={'what': 
'tasks/assert_device_absent.yml', 'condition': True}) 30583 1726853770.66079: extending task lists for all hosts with included blocks 30583 1726853770.66745: done extending task lists 30583 1726853770.66746: done processing included files 30583 1726853770.66746: results queue empty 30583 1726853770.66747: checking for any_errors_fatal 30583 1726853770.66749: done checking for any_errors_fatal 30583 1726853770.66749: checking for max_fail_percentage 30583 1726853770.66750: done checking for max_fail_percentage 30583 1726853770.66751: checking to see if all hosts have failed and the running result is not ok 30583 1726853770.66751: done checking to see if all hosts have failed 30583 1726853770.66752: getting the remaining hosts for this loop 30583 1726853770.66753: done getting the remaining hosts for this loop 30583 1726853770.66754: getting the next task for host managed_node2 30583 1726853770.66757: done getting next task for host managed_node2 30583 1726853770.66759: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30583 1726853770.66762: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853770.66769: getting variables 30583 1726853770.66770: in VariableManager get_vars() 30583 1726853770.66781: Calling all_inventory to load vars for managed_node2 30583 1726853770.66783: Calling groups_inventory to load vars for managed_node2 30583 1726853770.66785: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853770.66790: Calling all_plugins_play to load vars for managed_node2 30583 1726853770.66791: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853770.66793: Calling groups_plugins_play to load vars for managed_node2 30583 1726853770.67446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853770.68350: done with get_vars() 30583 1726853770.68365: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 13:36:10 -0400 (0:00:00.093) 0:01:46.021 ****** 30583 1726853770.68421: entering _queue_task() for managed_node2/include_tasks 30583 1726853770.68693: worker is 1 (out of 1 available) 30583 1726853770.68707: exiting _queue_task() for managed_node2/include_tasks 30583 1726853770.68719: done queuing things up, now waiting for results queue to drain 30583 1726853770.68721: waiting for pending results... 
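The "Conditional asserts" step above iterates over `lsr_assert_when` entries and includes each task file whose condition holds; the log shows `item['condition']` evaluating True for `{'what': 'tasks/assert_device_absent.yml', 'condition': True}`. A plausible sketch of that loop (an assumption; run_test.yml:42 is the authoritative source):

```yaml
# Hypothetical reconstruction of the include loop at run_test.yml:42.
- name: Conditional asserts
  include_tasks: "{{ item.what }}"  # e.g. tasks/assert_device_absent.yml
  when: item.condition              # the log shows Evaluated conditional (item['condition']): True
  loop: "{{ lsr_assert_when }}"
```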
30583 1726853770.68916: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 30583 1726853770.69012: in run() - task 02083763-bbaf-05ea-abc5-000000001f59 30583 1726853770.69022: variable 'ansible_search_path' from source: unknown 30583 1726853770.69025: variable 'ansible_search_path' from source: unknown 30583 1726853770.69060: calling self._execute() 30583 1726853770.69138: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853770.69141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853770.69149: variable 'omit' from source: magic vars 30583 1726853770.69452: variable 'ansible_distribution_major_version' from source: facts 30583 1726853770.69464: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853770.69469: _execute() done 30583 1726853770.69474: dumping result to json 30583 1726853770.69476: done dumping result, returning 30583 1726853770.69485: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-05ea-abc5-000000001f59] 30583 1726853770.69489: sending task result for task 02083763-bbaf-05ea-abc5-000000001f59 30583 1726853770.69575: done sending task result for task 02083763-bbaf-05ea-abc5-000000001f59 30583 1726853770.69578: WORKER PROCESS EXITING 30583 1726853770.69615: no more pending results, returning what we have 30583 1726853770.69620: in VariableManager get_vars() 30583 1726853770.69668: Calling all_inventory to load vars for managed_node2 30583 1726853770.69672: Calling groups_inventory to load vars for managed_node2 30583 1726853770.69676: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853770.69688: Calling all_plugins_play to load vars for managed_node2 30583 1726853770.69691: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853770.69694: Calling groups_plugins_play to load vars for managed_node2 30583 
1726853770.70524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853770.71397: done with get_vars() 30583 1726853770.71415: variable 'ansible_search_path' from source: unknown 30583 1726853770.71416: variable 'ansible_search_path' from source: unknown 30583 1726853770.71528: variable 'item' from source: include params 30583 1726853770.71552: we have included files to process 30583 1726853770.71553: generating all_blocks data 30583 1726853770.71554: done generating all_blocks data 30583 1726853770.71556: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30583 1726853770.71556: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30583 1726853770.71558: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30583 1726853770.71688: done processing included file 30583 1726853770.71690: iterating over new_blocks loaded from include file 30583 1726853770.71691: in VariableManager get_vars() 30583 1726853770.71703: done with get_vars() 30583 1726853770.71704: filtering new block on tags 30583 1726853770.71720: done filtering new block on tags 30583 1726853770.71722: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 30583 1726853770.71727: extending task lists for all hosts with included blocks 30583 1726853770.71820: done extending task lists 30583 1726853770.71821: done processing included files 30583 1726853770.71821: results queue empty 30583 1726853770.71822: checking for any_errors_fatal 30583 1726853770.71824: done checking for any_errors_fatal 30583 1726853770.71825: checking for 
max_fail_percentage 30583 1726853770.71826: done checking for max_fail_percentage 30583 1726853770.71826: checking to see if all hosts have failed and the running result is not ok 30583 1726853770.71827: done checking to see if all hosts have failed 30583 1726853770.71827: getting the remaining hosts for this loop 30583 1726853770.71828: done getting the remaining hosts for this loop 30583 1726853770.71829: getting the next task for host managed_node2 30583 1726853770.71833: done getting next task for host managed_node2 30583 1726853770.71836: ^ task is: TASK: Get stat for interface {{ interface }} 30583 1726853770.71839: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853770.71841: getting variables 30583 1726853770.71842: in VariableManager get_vars() 30583 1726853770.71849: Calling all_inventory to load vars for managed_node2 30583 1726853770.71851: Calling groups_inventory to load vars for managed_node2 30583 1726853770.71852: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853770.71857: Calling all_plugins_play to load vars for managed_node2 30583 1726853770.71859: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853770.71861: Calling groups_plugins_play to load vars for managed_node2 30583 1726853770.72581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853770.73433: done with get_vars() 30583 1726853770.73451: done getting variables 30583 1726853770.73545: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:36:10 -0400 (0:00:00.051) 0:01:46.072 ****** 30583 1726853770.73568: entering _queue_task() for managed_node2/stat 30583 1726853770.73848: worker is 1 (out of 1 available) 30583 1726853770.73861: exiting _queue_task() for managed_node2/stat 30583 1726853770.73876: done queuing things up, now waiting for results queue to drain 30583 1726853770.73877: waiting for pending results... 
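The "Get stat for interface statebr" task that follows loads the `stat` action and runs it over the persistent SSH connection. In tests of this kind the stat typically targets the interface's sysfs entry; a plausible sketch of get_interface_stat.yml:3 (the path and register name are assumptions, not shown in the log):

```yaml
# Plausible sketch only; the actual get_interface_stat.yml may differ.
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"  # assumed location of the device entry
  register: interface_stat
```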
30583 1726853770.74076: running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr 30583 1726853770.74163: in run() - task 02083763-bbaf-05ea-abc5-000000001fe8 30583 1726853770.74178: variable 'ansible_search_path' from source: unknown 30583 1726853770.74182: variable 'ansible_search_path' from source: unknown 30583 1726853770.74215: calling self._execute() 30583 1726853770.74294: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853770.74297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853770.74306: variable 'omit' from source: magic vars 30583 1726853770.74601: variable 'ansible_distribution_major_version' from source: facts 30583 1726853770.74612: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853770.74618: variable 'omit' from source: magic vars 30583 1726853770.74652: variable 'omit' from source: magic vars 30583 1726853770.74729: variable 'interface' from source: play vars 30583 1726853770.74743: variable 'omit' from source: magic vars 30583 1726853770.74782: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853770.74809: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853770.74826: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853770.74840: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853770.74851: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853770.74882: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853770.74885: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853770.74887: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853770.74954: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853770.74959: Set connection var ansible_timeout to 10 30583 1726853770.74964: Set connection var ansible_connection to ssh 30583 1726853770.74970: Set connection var ansible_shell_executable to /bin/sh 30583 1726853770.74974: Set connection var ansible_shell_type to sh 30583 1726853770.74987: Set connection var ansible_pipelining to False 30583 1726853770.75002: variable 'ansible_shell_executable' from source: unknown 30583 1726853770.75005: variable 'ansible_connection' from source: unknown 30583 1726853770.75008: variable 'ansible_module_compression' from source: unknown 30583 1726853770.75010: variable 'ansible_shell_type' from source: unknown 30583 1726853770.75012: variable 'ansible_shell_executable' from source: unknown 30583 1726853770.75014: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853770.75017: variable 'ansible_pipelining' from source: unknown 30583 1726853770.75019: variable 'ansible_timeout' from source: unknown 30583 1726853770.75024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853770.75181: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853770.75191: variable 'omit' from source: magic vars 30583 1726853770.75198: starting attempt loop 30583 1726853770.75201: running the handler 30583 1726853770.75213: _low_level_execute_command(): starting 30583 1726853770.75220: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853770.75746: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853770.75750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853770.75754: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853770.75757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853770.75803: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853770.75806: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853770.75808: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853770.75897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853770.77621: stdout chunk (state=3): >>>/root <<< 30583 1726853770.77724: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853770.77754: stderr chunk (state=3): >>><<< 30583 1726853770.77757: stdout chunk (state=3): >>><<< 30583 1726853770.77782: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853770.77795: _low_level_execute_command(): starting 30583 1726853770.77801: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853770.7778277-35573-134922468703533 `" && echo ansible-tmp-1726853770.7778277-35573-134922468703533="` echo /root/.ansible/tmp/ansible-tmp-1726853770.7778277-35573-134922468703533 `" ) && sleep 0' 30583 1726853770.78222: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853770.78230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853770.78261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853770.78276: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853770.78278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853770.78320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853770.78327: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853770.78330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853770.78396: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853770.80482: stdout chunk (state=3): >>>ansible-tmp-1726853770.7778277-35573-134922468703533=/root/.ansible/tmp/ansible-tmp-1726853770.7778277-35573-134922468703533 <<< 30583 1726853770.80617: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853770.80641: stderr chunk (state=3): >>><<< 30583 1726853770.80644: stdout chunk (state=3): >>><<< 30583 1726853770.80659: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853770.7778277-35573-134922468703533=/root/.ansible/tmp/ansible-tmp-1726853770.7778277-35573-134922468703533 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853770.80703: variable 'ansible_module_compression' from source: unknown 30583 1726853770.80754: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30583 1726853770.80793: variable 'ansible_facts' from source: unknown 30583 1726853770.80853: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853770.7778277-35573-134922468703533/AnsiballZ_stat.py 30583 1726853770.80966: Sending initial data 30583 1726853770.80969: Sent initial data (153 bytes) 30583 1726853770.81422: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853770.81426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853770.81428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853770.81430: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853770.81432: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853770.81478: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853770.81495: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853770.81499: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853770.81562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853770.83201: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853770.83268: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853770.83335: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp9ul3m597 /root/.ansible/tmp/ansible-tmp-1726853770.7778277-35573-134922468703533/AnsiballZ_stat.py <<< 30583 1726853770.83343: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853770.7778277-35573-134922468703533/AnsiballZ_stat.py" <<< 30583 1726853770.83404: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp9ul3m597" to remote "/root/.ansible/tmp/ansible-tmp-1726853770.7778277-35573-134922468703533/AnsiballZ_stat.py" <<< 30583 1726853770.83408: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853770.7778277-35573-134922468703533/AnsiballZ_stat.py" <<< 30583 1726853770.84083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853770.84125: stderr chunk (state=3): >>><<< 30583 1726853770.84128: stdout chunk (state=3): >>><<< 30583 1726853770.84167: done transferring module to remote 30583 1726853770.84180: _low_level_execute_command(): starting 30583 1726853770.84185: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853770.7778277-35573-134922468703533/ /root/.ansible/tmp/ansible-tmp-1726853770.7778277-35573-134922468703533/AnsiballZ_stat.py && sleep 0' 30583 1726853770.84637: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853770.84642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 
1726853770.84645: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853770.84647: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853770.84653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853770.84701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853770.84705: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853770.84709: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853770.84780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853770.86652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853770.86680: stderr chunk (state=3): >>><<< 30583 1726853770.86683: stdout chunk (state=3): >>><<< 30583 1726853770.86699: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853770.86702: _low_level_execute_command(): starting 30583 1726853770.86705: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853770.7778277-35573-134922468703533/AnsiballZ_stat.py && sleep 0' 30583 1726853770.87141: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853770.87144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853770.87147: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853770.87148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853770.87200: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853770.87204: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853770.87211: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853770.87288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853771.02892: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30583 1726853771.04480: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853771.04484: stdout chunk (state=3): >>><<< 30583 1726853771.04486: stderr chunk (state=3): >>><<< 30583 1726853771.04488: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853771.04491: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853770.7778277-35573-134922468703533/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853771.04493: _low_level_execute_command(): starting 30583 1726853771.04495: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853770.7778277-35573-134922468703533/ > /dev/null 2>&1 && sleep 0' 30583 1726853771.05109: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853771.05121: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853771.05172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853771.05256: stderr chunk (state=3): >>>debug2: match found <<< 30583 1726853771.05285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853771.05309: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853771.05419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853771.07401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853771.07404: stdout chunk (state=3): >>><<< 30583 1726853771.07412: stderr chunk (state=3): >>><<< 30583 1726853771.07523: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853771.07527: handler run complete 30583 1726853771.07529: attempt loop complete, returning result 30583 1726853771.07531: _execute() done 30583 1726853771.07533: dumping result to json 30583 1726853771.07535: done dumping result, returning 30583 1726853771.07537: done running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr [02083763-bbaf-05ea-abc5-000000001fe8] 30583 1726853771.07539: sending task result for task 02083763-bbaf-05ea-abc5-000000001fe8 30583 1726853771.07616: done sending task result for task 02083763-bbaf-05ea-abc5-000000001fe8 30583 1726853771.07619: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 30583 1726853771.07687: no more pending results, returning what we have 30583 1726853771.07691: results queue empty 30583 1726853771.07692: checking for any_errors_fatal 30583 1726853771.07694: done checking for any_errors_fatal 30583 1726853771.07695: checking for max_fail_percentage 30583 1726853771.07697: done checking for max_fail_percentage 30583 1726853771.07699: checking to see if all hosts have failed and the running result is not ok 30583 1726853771.07699: done checking to see if all hosts have failed 30583 1726853771.07700: getting the remaining hosts for this loop 30583 1726853771.07702: done getting the remaining hosts for this loop 30583 1726853771.07706: getting the next task for host managed_node2 30583 1726853771.07718: done getting next task for host managed_node2 30583 1726853771.07720: 
^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 30583 1726853771.07725: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853771.07730: getting variables 30583 1726853771.07732: in VariableManager get_vars() 30583 1726853771.07887: Calling all_inventory to load vars for managed_node2 30583 1726853771.07891: Calling groups_inventory to load vars for managed_node2 30583 1726853771.07895: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853771.07906: Calling all_plugins_play to load vars for managed_node2 30583 1726853771.07910: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853771.07913: Calling groups_plugins_play to load vars for managed_node2 30583 1726853771.09606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853771.11266: done with get_vars() 30583 1726853771.11294: done getting variables 30583 1726853771.11366: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853771.11501: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 13:36:11 -0400 (0:00:00.379) 0:01:46.452 ****** 30583 1726853771.11530: entering _queue_task() for managed_node2/assert 30583 1726853771.11933: worker is 1 (out of 1 available) 30583 1726853771.11948: exiting _queue_task() for managed_node2/assert 30583 1726853771.11962: done queuing things up, now waiting for results queue to drain 30583 1726853771.11963: waiting for pending results... 30583 1726853771.12350: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' 30583 1726853771.12450: in run() - task 02083763-bbaf-05ea-abc5-000000001f5a 30583 1726853771.12453: variable 'ansible_search_path' from source: unknown 30583 1726853771.12456: variable 'ansible_search_path' from source: unknown 30583 1726853771.12468: calling self._execute() 30583 1726853771.12582: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.12592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.12606: variable 'omit' from source: magic vars 30583 1726853771.13031: variable 'ansible_distribution_major_version' from source: facts 30583 1726853771.13107: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853771.13110: variable 'omit' from source: magic vars 30583 1726853771.13123: variable 'omit' from source: magic vars 30583 1726853771.13234: variable 'interface' from source: play vars 30583 1726853771.13255: variable 'omit' from source: magic vars 30583 1726853771.13304: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853771.13352: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853771.13383: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853771.13405: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853771.13420: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853771.13541: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853771.13544: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.13547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.13589: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853771.13600: Set connection var ansible_timeout to 10 30583 1726853771.13606: Set connection var ansible_connection to ssh 30583 1726853771.13615: Set connection var ansible_shell_executable to /bin/sh 30583 1726853771.13621: Set connection var ansible_shell_type to sh 30583 1726853771.13633: Set connection var ansible_pipelining to False 30583 1726853771.13673: variable 'ansible_shell_executable' from source: unknown 30583 1726853771.13682: variable 'ansible_connection' from source: unknown 30583 1726853771.13688: variable 'ansible_module_compression' from source: unknown 30583 1726853771.13694: variable 'ansible_shell_type' from source: unknown 30583 1726853771.13700: variable 'ansible_shell_executable' from source: unknown 30583 1726853771.13706: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.13713: variable 'ansible_pipelining' from source: unknown 30583 1726853771.13719: variable 'ansible_timeout' 
from source: unknown 30583 1726853771.13726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.13875: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853771.13893: variable 'omit' from source: magic vars 30583 1726853771.13978: starting attempt loop 30583 1726853771.13982: running the handler 30583 1726853771.14065: variable 'interface_stat' from source: set_fact 30583 1726853771.14089: Evaluated conditional (not interface_stat.stat.exists): True 30583 1726853771.14098: handler run complete 30583 1726853771.14115: attempt loop complete, returning result 30583 1726853771.14121: _execute() done 30583 1726853771.14126: dumping result to json 30583 1726853771.14133: done dumping result, returning 30583 1726853771.14143: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' [02083763-bbaf-05ea-abc5-000000001f5a] 30583 1726853771.14150: sending task result for task 02083763-bbaf-05ea-abc5-000000001f5a ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30583 1726853771.14347: no more pending results, returning what we have 30583 1726853771.14351: results queue empty 30583 1726853771.14352: checking for any_errors_fatal 30583 1726853771.14365: done checking for any_errors_fatal 30583 1726853771.14365: checking for max_fail_percentage 30583 1726853771.14367: done checking for max_fail_percentage 30583 1726853771.14368: checking to see if all hosts have failed and the running result is not ok 30583 1726853771.14369: done checking to see if all hosts have failed 30583 1726853771.14370: getting the remaining hosts for this loop 30583 1726853771.14374: done getting the remaining hosts for this loop 30583 
1726853771.14379: getting the next task for host managed_node2 30583 1726853771.14390: done getting next task for host managed_node2 30583 1726853771.14393: ^ task is: TASK: Success in test '{{ lsr_description }}' 30583 1726853771.14397: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853771.14401: getting variables 30583 1726853771.14405: in VariableManager get_vars() 30583 1726853771.14448: Calling all_inventory to load vars for managed_node2 30583 1726853771.14450: Calling groups_inventory to load vars for managed_node2 30583 1726853771.14454: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853771.14469: Calling all_plugins_play to load vars for managed_node2 30583 1726853771.14679: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853771.14684: Calling groups_plugins_play to load vars for managed_node2 30583 1726853771.15296: done sending task result for task 02083763-bbaf-05ea-abc5-000000001f5a 30583 1726853771.15299: WORKER PROCESS EXITING 30583 1726853771.16422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853771.17978: done with get_vars() 30583 1726853771.18002: done getting variables 30583 1726853771.18055: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853771.18143: variable 'lsr_description' from source: include params TASK [Success in test 'I can take a profile down that is absent'] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 13:36:11 -0400 (0:00:00.066) 0:01:46.518 ****** 30583 1726853771.18169: entering _queue_task() for managed_node2/debug 30583 1726853771.18439: worker is 1 (out of 1 available) 30583 1726853771.18453: exiting _queue_task() for managed_node2/debug 30583 1726853771.18467: done queuing things up, now waiting for results queue to drain 30583 1726853771.18469: waiting for pending results... 30583 1726853771.18666: running TaskExecutor() for managed_node2/TASK: Success in test 'I can take a profile down that is absent' 30583 1726853771.18741: in run() - task 02083763-bbaf-05ea-abc5-00000000174b 30583 1726853771.18752: variable 'ansible_search_path' from source: unknown 30583 1726853771.18755: variable 'ansible_search_path' from source: unknown 30583 1726853771.18788: calling self._execute() 30583 1726853771.18865: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.18869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.18880: variable 'omit' from source: magic vars 30583 1726853771.19167: variable 'ansible_distribution_major_version' from source: facts 30583 1726853771.19179: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853771.19185: variable 'omit' from source: magic vars 30583 1726853771.19210: variable 'omit' from source: magic vars 30583 1726853771.19283: variable 'lsr_description' from source: include params 30583 1726853771.19297: variable 'omit' from source: magic vars 30583 
1726853771.19330: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853771.19362: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853771.19380: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853771.19394: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853771.19403: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853771.19428: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853771.19431: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.19433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.19507: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853771.19510: Set connection var ansible_timeout to 10 30583 1726853771.19513: Set connection var ansible_connection to ssh 30583 1726853771.19520: Set connection var ansible_shell_executable to /bin/sh 30583 1726853771.19522: Set connection var ansible_shell_type to sh 30583 1726853771.19529: Set connection var ansible_pipelining to False 30583 1726853771.19592: variable 'ansible_shell_executable' from source: unknown 30583 1726853771.19596: variable 'ansible_connection' from source: unknown 30583 1726853771.19599: variable 'ansible_module_compression' from source: unknown 30583 1726853771.19601: variable 'ansible_shell_type' from source: unknown 30583 1726853771.19603: variable 'ansible_shell_executable' from source: unknown 30583 1726853771.19605: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.19607: variable 'ansible_pipelining' from source: unknown 30583 1726853771.19610: variable 
'ansible_timeout' from source: unknown 30583 1726853771.19618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.19876: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853771.19880: variable 'omit' from source: magic vars 30583 1726853771.19882: starting attempt loop 30583 1726853771.19885: running the handler 30583 1726853771.19887: handler run complete 30583 1726853771.19889: attempt loop complete, returning result 30583 1726853771.19892: _execute() done 30583 1726853771.19894: dumping result to json 30583 1726853771.19896: done dumping result, returning 30583 1726853771.19898: done running TaskExecutor() for managed_node2/TASK: Success in test 'I can take a profile down that is absent' [02083763-bbaf-05ea-abc5-00000000174b] 30583 1726853771.19899: sending task result for task 02083763-bbaf-05ea-abc5-00000000174b 30583 1726853771.19958: done sending task result for task 02083763-bbaf-05ea-abc5-00000000174b 30583 1726853771.19962: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: +++++ Success in test 'I can take a profile down that is absent' +++++ 30583 1726853771.20011: no more pending results, returning what we have 30583 1726853771.20014: results queue empty 30583 1726853771.20016: checking for any_errors_fatal 30583 1726853771.20024: done checking for any_errors_fatal 30583 1726853771.20024: checking for max_fail_percentage 30583 1726853771.20026: done checking for max_fail_percentage 30583 1726853771.20028: checking to see if all hosts have failed and the running result is not ok 30583 1726853771.20028: done checking to see if all hosts have failed 30583 1726853771.20029: getting the remaining hosts for this loop 30583 1726853771.20031: done getting 
the remaining hosts for this loop 30583 1726853771.20035: getting the next task for host managed_node2 30583 1726853771.20042: done getting next task for host managed_node2 30583 1726853771.20045: ^ task is: TASK: Cleanup 30583 1726853771.20048: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853771.20053: getting variables 30583 1726853771.20054: in VariableManager get_vars() 30583 1726853771.20097: Calling all_inventory to load vars for managed_node2 30583 1726853771.20100: Calling groups_inventory to load vars for managed_node2 30583 1726853771.20103: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853771.20115: Calling all_plugins_play to load vars for managed_node2 30583 1726853771.20118: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853771.20120: Calling groups_plugins_play to load vars for managed_node2 30583 1726853771.21316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853771.22331: done with get_vars() 30583 1726853771.22348: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 13:36:11 -0400 (0:00:00.042) 0:01:46.561 ****** 30583 1726853771.22423: entering _queue_task() for managed_node2/include_tasks 30583 
1726853771.22696: worker is 1 (out of 1 available) 30583 1726853771.22709: exiting _queue_task() for managed_node2/include_tasks 30583 1726853771.22721: done queuing things up, now waiting for results queue to drain 30583 1726853771.22722: waiting for pending results... 30583 1726853771.23088: running TaskExecutor() for managed_node2/TASK: Cleanup 30583 1726853771.23093: in run() - task 02083763-bbaf-05ea-abc5-00000000174f 30583 1726853771.23096: variable 'ansible_search_path' from source: unknown 30583 1726853771.23098: variable 'ansible_search_path' from source: unknown 30583 1726853771.23137: variable 'lsr_cleanup' from source: include params 30583 1726853771.23362: variable 'lsr_cleanup' from source: include params 30583 1726853771.23439: variable 'omit' from source: magic vars 30583 1726853771.23586: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.23607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.23622: variable 'omit' from source: magic vars 30583 1726853771.23877: variable 'ansible_distribution_major_version' from source: facts 30583 1726853771.23893: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853771.23905: variable 'item' from source: unknown 30583 1726853771.23969: variable 'item' from source: unknown 30583 1726853771.24007: variable 'item' from source: unknown 30583 1726853771.24068: variable 'item' from source: unknown 30583 1726853771.24476: dumping result to json 30583 1726853771.24479: done dumping result, returning 30583 1726853771.24482: done running TaskExecutor() for managed_node2/TASK: Cleanup [02083763-bbaf-05ea-abc5-00000000174f] 30583 1726853771.24484: sending task result for task 02083763-bbaf-05ea-abc5-00000000174f 30583 1726853771.24528: done sending task result for task 02083763-bbaf-05ea-abc5-00000000174f 30583 1726853771.24549: no more pending results, returning what we have 30583 1726853771.24554: in 
VariableManager get_vars() 30583 1726853771.24596: Calling all_inventory to load vars for managed_node2 30583 1726853771.24599: Calling groups_inventory to load vars for managed_node2 30583 1726853771.24602: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853771.24609: WORKER PROCESS EXITING 30583 1726853771.24673: Calling all_plugins_play to load vars for managed_node2 30583 1726853771.24677: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853771.24681: Calling groups_plugins_play to load vars for managed_node2 30583 1726853771.25986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853771.27534: done with get_vars() 30583 1726853771.27560: variable 'ansible_search_path' from source: unknown 30583 1726853771.27562: variable 'ansible_search_path' from source: unknown 30583 1726853771.27604: we have included files to process 30583 1726853771.27606: generating all_blocks data 30583 1726853771.27607: done generating all_blocks data 30583 1726853771.27611: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30583 1726853771.27612: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30583 1726853771.27615: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30583 1726853771.27811: done processing included file 30583 1726853771.27813: iterating over new_blocks loaded from include file 30583 1726853771.27815: in VariableManager get_vars() 30583 1726853771.27832: done with get_vars() 30583 1726853771.27835: filtering new block on tags 30583 1726853771.27860: done filtering new block on tags 30583 1726853771.27863: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node2 => (item=tasks/cleanup_profile+device.yml) 30583 1726853771.27868: extending task lists for all hosts with included blocks 30583 1726853771.29191: done extending task lists 30583 1726853771.29193: done processing included files 30583 1726853771.29193: results queue empty 30583 1726853771.29194: checking for any_errors_fatal 30583 1726853771.29198: done checking for any_errors_fatal 30583 1726853771.29198: checking for max_fail_percentage 30583 1726853771.29200: done checking for max_fail_percentage 30583 1726853771.29201: checking to see if all hosts have failed and the running result is not ok 30583 1726853771.29201: done checking to see if all hosts have failed 30583 1726853771.29202: getting the remaining hosts for this loop 30583 1726853771.29204: done getting the remaining hosts for this loop 30583 1726853771.29207: getting the next task for host managed_node2 30583 1726853771.29211: done getting next task for host managed_node2 30583 1726853771.29214: ^ task is: TASK: Cleanup profile and device 30583 1726853771.29216: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 30583 1726853771.29219: getting variables 30583 1726853771.29220: in VariableManager get_vars() 30583 1726853771.29234: Calling all_inventory to load vars for managed_node2 30583 1726853771.29236: Calling groups_inventory to load vars for managed_node2 30583 1726853771.29239: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853771.29245: Calling all_plugins_play to load vars for managed_node2 30583 1726853771.29247: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853771.29250: Calling groups_plugins_play to load vars for managed_node2 30583 1726853771.30507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853771.32037: done with get_vars() 30583 1726853771.32070: done getting variables 30583 1726853771.32121: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 13:36:11 -0400 (0:00:00.097) 0:01:46.658 ****** 30583 1726853771.32154: entering _queue_task() for managed_node2/shell 30583 1726853771.32532: worker is 1 (out of 1 available) 30583 1726853771.32543: exiting _queue_task() for managed_node2/shell 30583 1726853771.32555: done queuing things up, now waiting for results queue to drain 30583 1726853771.32556: waiting for pending results... 
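Aside: the earlier task "Assert that the interface is absent - 'statebr'" passed because the conditional `(not interface_stat.stat.exists)` evaluated True. A minimal sketch of that same absence check expressed directly in sh — note that probing `/sys/class/net/<iface>` is an assumed equivalent of the playbook's `stat` result, not its exact code:

```shell
# Hedged sketch: the logic behind the assert task above.
# The playbook asserts (not interface_stat.stat.exists); here we
# approximate that stat by testing /sys/class/net/<iface> (assumption).
iface="statebr"
if [ ! -e "/sys/class/net/$iface" ]; then
    # Mirrors the task result in the log: MSG: All assertions passed
    echo "All assertions passed"
else
    echo "interface $iface is still present" >&2
fi
```

On a host where the `statebr` device has been removed, the check prints "All assertions passed", matching the task result recorded in the log.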
30583 1726853771.32859: running TaskExecutor() for managed_node2/TASK: Cleanup profile and device 30583 1726853771.32986: in run() - task 02083763-bbaf-05ea-abc5-00000000200b 30583 1726853771.33009: variable 'ansible_search_path' from source: unknown 30583 1726853771.33016: variable 'ansible_search_path' from source: unknown 30583 1726853771.33056: calling self._execute() 30583 1726853771.33168: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.33182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.33197: variable 'omit' from source: magic vars 30583 1726853771.33599: variable 'ansible_distribution_major_version' from source: facts 30583 1726853771.33617: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853771.33629: variable 'omit' from source: magic vars 30583 1726853771.33685: variable 'omit' from source: magic vars 30583 1726853771.33833: variable 'interface' from source: play vars 30583 1726853771.33862: variable 'omit' from source: magic vars 30583 1726853771.33912: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853771.34176: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853771.34179: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853771.34182: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853771.34184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853771.34186: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853771.34188: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.34190: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.34192: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853771.34194: Set connection var ansible_timeout to 10 30583 1726853771.34196: Set connection var ansible_connection to ssh 30583 1726853771.34198: Set connection var ansible_shell_executable to /bin/sh 30583 1726853771.34200: Set connection var ansible_shell_type to sh 30583 1726853771.34202: Set connection var ansible_pipelining to False 30583 1726853771.34210: variable 'ansible_shell_executable' from source: unknown 30583 1726853771.34218: variable 'ansible_connection' from source: unknown 30583 1726853771.34224: variable 'ansible_module_compression' from source: unknown 30583 1726853771.34230: variable 'ansible_shell_type' from source: unknown 30583 1726853771.34237: variable 'ansible_shell_executable' from source: unknown 30583 1726853771.34243: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.34250: variable 'ansible_pipelining' from source: unknown 30583 1726853771.34256: variable 'ansible_timeout' from source: unknown 30583 1726853771.34263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.34407: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853771.34431: variable 'omit' from source: magic vars 30583 1726853771.34441: starting attempt loop 30583 1726853771.34448: running the handler 30583 1726853771.34462: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853771.34491: _low_level_execute_command(): starting 30583 1726853771.34503: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853771.35238: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853771.35287: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853771.35311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853771.35386: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853771.35410: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853771.35435: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853771.35538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853771.37286: stdout chunk (state=3): >>>/root <<< 30583 1726853771.37412: stderr chunk (state=3): >>>debug2: Received exit status from master 
0 <<< 30583 1726853771.37424: stderr chunk (state=3): >>><<< 30583 1726853771.37427: stdout chunk (state=3): >>><<< 30583 1726853771.37451: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853771.37464: _low_level_execute_command(): starting 30583 1726853771.37470: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853771.3744993-35592-100496386746176 `" && echo ansible-tmp-1726853771.3744993-35592-100496386746176="` echo /root/.ansible/tmp/ansible-tmp-1726853771.3744993-35592-100496386746176 `" ) && sleep 0' 30583 1726853771.37906: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853771.37909: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853771.37922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853771.37924: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853771.37927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853771.37969: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853771.37975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853771.38050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853771.40038: stdout chunk (state=3): >>>ansible-tmp-1726853771.3744993-35592-100496386746176=/root/.ansible/tmp/ansible-tmp-1726853771.3744993-35592-100496386746176 <<< 30583 1726853771.40144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853771.40202: stderr chunk (state=3): >>><<< 30583 1726853771.40206: stdout chunk (state=3): >>><<< 30583 1726853771.40209: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853771.3744993-35592-100496386746176=/root/.ansible/tmp/ansible-tmp-1726853771.3744993-35592-100496386746176 , stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853771.40212: variable 'ansible_module_compression' from source: unknown 30583 1726853771.40252: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30583 1726853771.40284: variable 'ansible_facts' from source: unknown 30583 1726853771.40339: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853771.3744993-35592-100496386746176/AnsiballZ_command.py 30583 1726853771.40434: Sending initial data 30583 1726853771.40438: Sent initial data (156 bytes) 30583 1726853771.40846: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853771.40850: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853771.40866: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853771.40922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853771.40928: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853771.40930: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853771.41001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853771.42686: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853771.42753: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853771.42820: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp8kywhvdl /root/.ansible/tmp/ansible-tmp-1726853771.3744993-35592-100496386746176/AnsiballZ_command.py <<< 30583 1726853771.42826: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853771.3744993-35592-100496386746176/AnsiballZ_command.py" <<< 30583 1726853771.42890: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp8kywhvdl" to remote "/root/.ansible/tmp/ansible-tmp-1726853771.3744993-35592-100496386746176/AnsiballZ_command.py" <<< 30583 1726853771.42893: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853771.3744993-35592-100496386746176/AnsiballZ_command.py" <<< 30583 1726853771.43537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853771.43577: stderr chunk (state=3): >>><<< 30583 1726853771.43580: stdout chunk (state=3): >>><<< 30583 1726853771.43602: done transferring module to remote 30583 1726853771.43611: _low_level_execute_command(): starting 30583 1726853771.43615: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853771.3744993-35592-100496386746176/ /root/.ansible/tmp/ansible-tmp-1726853771.3744993-35592-100496386746176/AnsiballZ_command.py && sleep 0' 30583 1726853771.44046: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853771.44049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match not found <<< 30583 1726853771.44051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853771.44053: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853771.44059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853771.44107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853771.44111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853771.44191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853771.46085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853771.46113: stderr chunk (state=3): >>><<< 30583 1726853771.46116: stdout chunk (state=3): >>><<< 30583 1726853771.46130: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853771.46133: _low_level_execute_command(): starting 30583 1726853771.46138: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853771.3744993-35592-100496386746176/AnsiballZ_command.py && sleep 0' 30583 1726853771.46544: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853771.46582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853771.46585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853771.46587: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853771.46590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853771.46638: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853771.46644: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853771.46647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853771.46723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853771.65729: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCould not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 13:36:11.623292", "end": "2024-09-20 13:36:11.656094", "delta": "0:00:00.032802", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30583 1726853771.67327: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853771.67356: stderr chunk (state=3): >>><<< 30583 1726853771.67362: stdout chunk (state=3): >>><<< 30583 1726853771.67380: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCould not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 13:36:11.623292", "end": "2024-09-20 13:36:11.656094", "delta": "0:00:00.032802", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.197 closed. 30583 1726853771.67410: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853771.3744993-35592-100496386746176/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853771.67420: _low_level_execute_command(): starting 30583 1726853771.67423: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853771.3744993-35592-100496386746176/ > /dev/null 2>&1 && sleep 0' 30583 1726853771.67846: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853771.67849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 
10.31.9.197 debug2: match not found <<< 30583 1726853771.67882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853771.67885: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853771.67887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853771.67942: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853771.67948: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853771.67950: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853771.68022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853771.69959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853771.69989: stderr chunk (state=3): >>><<< 30583 1726853771.69992: stdout chunk (state=3): >>><<< 30583 1726853771.70010: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853771.70016: handler run complete 30583 1726853771.70033: Evaluated conditional (False): False 30583 1726853771.70040: attempt loop complete, returning result 30583 1726853771.70043: _execute() done 30583 1726853771.70045: dumping result to json 30583 1726853771.70050: done dumping result, returning 30583 1726853771.70058: done running TaskExecutor() for managed_node2/TASK: Cleanup profile and device [02083763-bbaf-05ea-abc5-00000000200b] 30583 1726853771.70064: sending task result for task 02083763-bbaf-05ea-abc5-00000000200b 30583 1726853771.70158: done sending task result for task 02083763-bbaf-05ea-abc5-00000000200b 30583 1726853771.70161: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.032802", "end": "2024-09-20 13:36:11.656094", "rc": 1, "start": "2024-09-20 13:36:11.623292" } STDERR: Error: unknown connection 'statebr'. Error: cannot delete unknown connection(s): 'statebr'. 
Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' Cannot find device "statebr" MSG: non-zero return code ...ignoring 30583 1726853771.70227: no more pending results, returning what we have 30583 1726853771.70232: results queue empty 30583 1726853771.70233: checking for any_errors_fatal 30583 1726853771.70234: done checking for any_errors_fatal 30583 1726853771.70235: checking for max_fail_percentage 30583 1726853771.70237: done checking for max_fail_percentage 30583 1726853771.70238: checking to see if all hosts have failed and the running result is not ok 30583 1726853771.70239: done checking to see if all hosts have failed 30583 1726853771.70240: getting the remaining hosts for this loop 30583 1726853771.70241: done getting the remaining hosts for this loop 30583 1726853771.70245: getting the next task for host managed_node2 30583 1726853771.70257: done getting next task for host managed_node2 30583 1726853771.70259: ^ task is: TASK: Include the task 'run_test.yml' 30583 1726853771.70261: ^ state is: HOST STATE: block=8, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853771.70265: getting variables 30583 1726853771.70267: in VariableManager get_vars() 30583 1726853771.70313: Calling all_inventory to load vars for managed_node2 30583 1726853771.70316: Calling groups_inventory to load vars for managed_node2 30583 1726853771.70319: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853771.70330: Calling all_plugins_play to load vars for managed_node2 30583 1726853771.70333: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853771.70335: Calling groups_plugins_play to load vars for managed_node2 30583 1726853771.71314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853771.72794: done with get_vars() 30583 1726853771.72819: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:124 Friday 20 September 2024 13:36:11 -0400 (0:00:00.407) 0:01:47.066 ****** 30583 1726853771.72894: entering _queue_task() for managed_node2/include_tasks 30583 1726853771.73169: worker is 1 (out of 1 available) 30583 1726853771.73185: exiting _queue_task() for managed_node2/include_tasks 30583 1726853771.73197: done queuing things up, now waiting for results queue to drain 30583 1726853771.73199: waiting for pending results... 
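The "Cleanup profile and device" failure recorded above (rc=1, then "...ignoring") is expected: the task tears down a test connection and device that may not exist yet. A hedged reconstruction of that task, for readers following the log — the shell commands and the ignored non-zero exit are taken verbatim from the log output, while the exact YAML layout in the test file is an assumption:

```yaml
# Reconstructed from the log above. The command text and the ignored
# failure ("non-zero return code ... ...ignoring") are from the log;
# the surrounding task structure is assumed, not copied from the repo.
- name: Cleanup profile and device
  ansible.builtin.shell: |
    nmcli con delete statebr
    nmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr
    rm -f /etc/sysconfig/network-scripts/ifcfg-statebr
    ip link del statebr
  ignore_errors: true  # nmcli/ip exit non-zero when 'statebr' is already absent
```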
30583 1726853771.73390: running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' 30583 1726853771.73460: in run() - task 02083763-bbaf-05ea-abc5-000000000017 30583 1726853771.73469: variable 'ansible_search_path' from source: unknown 30583 1726853771.73500: calling self._execute() 30583 1726853771.73581: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.73585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.73594: variable 'omit' from source: magic vars 30583 1726853771.73880: variable 'ansible_distribution_major_version' from source: facts 30583 1726853771.73891: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853771.73896: _execute() done 30583 1726853771.73899: dumping result to json 30583 1726853771.73902: done dumping result, returning 30583 1726853771.73909: done running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' [02083763-bbaf-05ea-abc5-000000000017] 30583 1726853771.73913: sending task result for task 02083763-bbaf-05ea-abc5-000000000017 30583 1726853771.74018: done sending task result for task 02083763-bbaf-05ea-abc5-000000000017 30583 1726853771.74021: WORKER PROCESS EXITING 30583 1726853771.74060: no more pending results, returning what we have 30583 1726853771.74066: in VariableManager get_vars() 30583 1726853771.74120: Calling all_inventory to load vars for managed_node2 30583 1726853771.74123: Calling groups_inventory to load vars for managed_node2 30583 1726853771.74126: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853771.74141: Calling all_plugins_play to load vars for managed_node2 30583 1726853771.74144: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853771.74146: Calling groups_plugins_play to load vars for managed_node2 30583 1726853771.75655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 30583 1726853771.77077: done with get_vars() 30583 1726853771.77101: variable 'ansible_search_path' from source: unknown 30583 1726853771.77115: we have included files to process 30583 1726853771.77116: generating all_blocks data 30583 1726853771.77117: done generating all_blocks data 30583 1726853771.77120: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30583 1726853771.77121: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30583 1726853771.77123: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30583 1726853771.77398: in VariableManager get_vars() 30583 1726853771.77413: done with get_vars() 30583 1726853771.77441: in VariableManager get_vars() 30583 1726853771.77453: done with get_vars() 30583 1726853771.77481: in VariableManager get_vars() 30583 1726853771.77491: done with get_vars() 30583 1726853771.77515: in VariableManager get_vars() 30583 1726853771.77526: done with get_vars() 30583 1726853771.77552: in VariableManager get_vars() 30583 1726853771.77563: done with get_vars() 30583 1726853771.77830: in VariableManager get_vars() 30583 1726853771.77843: done with get_vars() 30583 1726853771.77851: done processing included file 30583 1726853771.77853: iterating over new_blocks loaded from include file 30583 1726853771.77854: in VariableManager get_vars() 30583 1726853771.77863: done with get_vars() 30583 1726853771.77864: filtering new block on tags 30583 1726853771.77923: done filtering new block on tags 30583 1726853771.77926: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node2 30583 1726853771.77929: extending task lists for all hosts with included 
blocks 30583 1726853771.77951: done extending task lists 30583 1726853771.77951: done processing included files 30583 1726853771.77952: results queue empty 30583 1726853771.77952: checking for any_errors_fatal 30583 1726853771.77956: done checking for any_errors_fatal 30583 1726853771.77956: checking for max_fail_percentage 30583 1726853771.77957: done checking for max_fail_percentage 30583 1726853771.77957: checking to see if all hosts have failed and the running result is not ok 30583 1726853771.77958: done checking to see if all hosts have failed 30583 1726853771.77959: getting the remaining hosts for this loop 30583 1726853771.77960: done getting the remaining hosts for this loop 30583 1726853771.77962: getting the next task for host managed_node2 30583 1726853771.77966: done getting next task for host managed_node2 30583 1726853771.77968: ^ task is: TASK: TEST: {{ lsr_description }} 30583 1726853771.77970: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853771.77973: getting variables 30583 1726853771.77974: in VariableManager get_vars() 30583 1726853771.77981: Calling all_inventory to load vars for managed_node2 30583 1726853771.77982: Calling groups_inventory to load vars for managed_node2 30583 1726853771.77984: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853771.77988: Calling all_plugins_play to load vars for managed_node2 30583 1726853771.77989: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853771.77991: Calling groups_plugins_play to load vars for managed_node2 30583 1726853771.78699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853771.80030: done with get_vars() 30583 1726853771.80052: done getting variables 30583 1726853771.80092: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853771.80178: variable 'lsr_description' from source: include params TASK [TEST: I will not get an error when I try to remove an absent profile] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 13:36:11 -0400 (0:00:00.073) 0:01:47.139 ****** 30583 1726853771.80202: entering _queue_task() for managed_node2/debug 30583 1726853771.80475: worker is 1 (out of 1 available) 30583 1726853771.80488: exiting _queue_task() for managed_node2/debug 30583 1726853771.80501: done queuing things up, now waiting for results queue to drain 30583 1726853771.80502: waiting for pending results... 
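The banner printed next ("########## I will not get an error ... ##########") comes from the debug task at run_test.yml:5. A minimal sketch of what that task likely looks like — the templated task name and the "##########" framing match the log, but the precise msg formatting is an assumption:

```yaml
# Sketch of the banner task at run_test.yml:5; message framing inferred
# from the MSG block in the log, exact YAML is assumed.
- name: "TEST: {{ lsr_description }}"
  ansible.builtin.debug:
    msg: "##########\n{{ lsr_description }}\n##########"
```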
30583 1726853771.80701: running TaskExecutor() for managed_node2/TASK: TEST: I will not get an error when I try to remove an absent profile 30583 1726853771.80786: in run() - task 02083763-bbaf-05ea-abc5-0000000020ad 30583 1726853771.80796: variable 'ansible_search_path' from source: unknown 30583 1726853771.80800: variable 'ansible_search_path' from source: unknown 30583 1726853771.80828: calling self._execute() 30583 1726853771.80909: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.80913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.80923: variable 'omit' from source: magic vars 30583 1726853771.81215: variable 'ansible_distribution_major_version' from source: facts 30583 1726853771.81225: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853771.81231: variable 'omit' from source: magic vars 30583 1726853771.81256: variable 'omit' from source: magic vars 30583 1726853771.81331: variable 'lsr_description' from source: include params 30583 1726853771.81345: variable 'omit' from source: magic vars 30583 1726853771.81388: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853771.81413: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853771.81431: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853771.81445: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853771.81456: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853771.81485: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853771.81490: variable 'ansible_host' from source: host vars for 'managed_node2' 
30583 1726853771.81492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.81561: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853771.81568: Set connection var ansible_timeout to 10 30583 1726853771.81573: Set connection var ansible_connection to ssh 30583 1726853771.81577: Set connection var ansible_shell_executable to /bin/sh 30583 1726853771.81580: Set connection var ansible_shell_type to sh 30583 1726853771.81589: Set connection var ansible_pipelining to False 30583 1726853771.81609: variable 'ansible_shell_executable' from source: unknown 30583 1726853771.81612: variable 'ansible_connection' from source: unknown 30583 1726853771.81615: variable 'ansible_module_compression' from source: unknown 30583 1726853771.81618: variable 'ansible_shell_type' from source: unknown 30583 1726853771.81620: variable 'ansible_shell_executable' from source: unknown 30583 1726853771.81622: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.81624: variable 'ansible_pipelining' from source: unknown 30583 1726853771.81628: variable 'ansible_timeout' from source: unknown 30583 1726853771.81631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.81739: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853771.81746: variable 'omit' from source: magic vars 30583 1726853771.81751: starting attempt loop 30583 1726853771.81754: running the handler 30583 1726853771.81796: handler run complete 30583 1726853771.81808: attempt loop complete, returning result 30583 1726853771.81811: _execute() done 30583 1726853771.81813: dumping result to json 30583 1726853771.81815: done dumping result, 
returning 30583 1726853771.81824: done running TaskExecutor() for managed_node2/TASK: TEST: I will not get an error when I try to remove an absent profile [02083763-bbaf-05ea-abc5-0000000020ad] 30583 1726853771.81826: sending task result for task 02083763-bbaf-05ea-abc5-0000000020ad 30583 1726853771.81908: done sending task result for task 02083763-bbaf-05ea-abc5-0000000020ad 30583 1726853771.81910: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: ########## I will not get an error when I try to remove an absent profile ########## 30583 1726853771.81976: no more pending results, returning what we have 30583 1726853771.81981: results queue empty 30583 1726853771.81982: checking for any_errors_fatal 30583 1726853771.81984: done checking for any_errors_fatal 30583 1726853771.81985: checking for max_fail_percentage 30583 1726853771.81986: done checking for max_fail_percentage 30583 1726853771.81987: checking to see if all hosts have failed and the running result is not ok 30583 1726853771.81988: done checking to see if all hosts have failed 30583 1726853771.81989: getting the remaining hosts for this loop 30583 1726853771.81990: done getting the remaining hosts for this loop 30583 1726853771.81994: getting the next task for host managed_node2 30583 1726853771.82002: done getting next task for host managed_node2 30583 1726853771.82005: ^ task is: TASK: Show item 30583 1726853771.82008: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853771.82012: getting variables 30583 1726853771.82014: in VariableManager get_vars() 30583 1726853771.82060: Calling all_inventory to load vars for managed_node2 30583 1726853771.82063: Calling groups_inventory to load vars for managed_node2 30583 1726853771.82066: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853771.82078: Calling all_plugins_play to load vars for managed_node2 30583 1726853771.82081: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853771.82083: Calling groups_plugins_play to load vars for managed_node2 30583 1726853771.82913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853771.83776: done with get_vars() 30583 1726853771.83793: done getting variables 30583 1726853771.83837: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 13:36:11 -0400 (0:00:00.036) 0:01:47.175 ****** 30583 1726853771.83859: entering _queue_task() for managed_node2/debug 30583 1726853771.84111: worker is 1 (out of 1 available) 30583 1726853771.84125: exiting _queue_task() for managed_node2/debug 30583 1726853771.84137: done queuing things up, now waiting for results queue to drain 30583 1726853771.84139: waiting for pending results... 
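The "Show item" task at run_test.yml:9 that runs next loops over variable names and prints each one's value; the result record below shows `ansible_loop_var: item` with `item: lsr_description`. A hedged sketch consistent with that output — the loop mechanism matches the log, but any items beyond `lsr_description` are an assumption:

```yaml
# Sketch of the "Show item" task at run_test.yml:9. The per-item output
# shape (ansible_loop_var: item) matches the log; further loop entries
# are assumed and not shown.
- name: Show item
  ansible.builtin.debug:
    var: "{{ item }}"
  loop:
    - lsr_description
```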
30583 1726853771.84329: running TaskExecutor() for managed_node2/TASK: Show item 30583 1726853771.84406: in run() - task 02083763-bbaf-05ea-abc5-0000000020ae 30583 1726853771.84418: variable 'ansible_search_path' from source: unknown 30583 1726853771.84421: variable 'ansible_search_path' from source: unknown 30583 1726853771.84461: variable 'omit' from source: magic vars 30583 1726853771.84579: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.84584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.84595: variable 'omit' from source: magic vars 30583 1726853771.84862: variable 'ansible_distribution_major_version' from source: facts 30583 1726853771.84874: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853771.84880: variable 'omit' from source: magic vars 30583 1726853771.84904: variable 'omit' from source: magic vars 30583 1726853771.84935: variable 'item' from source: unknown 30583 1726853771.84988: variable 'item' from source: unknown 30583 1726853771.85000: variable 'omit' from source: magic vars 30583 1726853771.85035: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853771.85063: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853771.85083: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853771.85098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853771.85109: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853771.85136: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853771.85139: variable 'ansible_host' from source: host vars for 'managed_node2' 
30583 1726853771.85142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.85213: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853771.85217: Set connection var ansible_timeout to 10 30583 1726853771.85219: Set connection var ansible_connection to ssh 30583 1726853771.85225: Set connection var ansible_shell_executable to /bin/sh 30583 1726853771.85228: Set connection var ansible_shell_type to sh 30583 1726853771.85238: Set connection var ansible_pipelining to False 30583 1726853771.85254: variable 'ansible_shell_executable' from source: unknown 30583 1726853771.85257: variable 'ansible_connection' from source: unknown 30583 1726853771.85259: variable 'ansible_module_compression' from source: unknown 30583 1726853771.85264: variable 'ansible_shell_type' from source: unknown 30583 1726853771.85266: variable 'ansible_shell_executable' from source: unknown 30583 1726853771.85268: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.85273: variable 'ansible_pipelining' from source: unknown 30583 1726853771.85276: variable 'ansible_timeout' from source: unknown 30583 1726853771.85280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.85383: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853771.85392: variable 'omit' from source: magic vars 30583 1726853771.85397: starting attempt loop 30583 1726853771.85400: running the handler 30583 1726853771.85439: variable 'lsr_description' from source: include params 30583 1726853771.85488: variable 'lsr_description' from source: include params 30583 1726853771.85496: handler run complete 30583 1726853771.85510: attempt loop 
complete, returning result 30583 1726853771.85521: variable 'item' from source: unknown 30583 1726853771.85575: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I will not get an error when I try to remove an absent profile" } 30583 1726853771.85710: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.85713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.85716: variable 'omit' from source: magic vars 30583 1726853771.85788: variable 'ansible_distribution_major_version' from source: facts 30583 1726853771.85792: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853771.85796: variable 'omit' from source: magic vars 30583 1726853771.85807: variable 'omit' from source: magic vars 30583 1726853771.85836: variable 'item' from source: unknown 30583 1726853771.85882: variable 'item' from source: unknown 30583 1726853771.85894: variable 'omit' from source: magic vars 30583 1726853771.85907: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853771.85913: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853771.85919: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853771.85928: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853771.85930: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.85935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.85983: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853771.85986: Set 
connection var ansible_timeout to 10 30583 1726853771.85988: Set connection var ansible_connection to ssh 30583 1726853771.85994: Set connection var ansible_shell_executable to /bin/sh 30583 1726853771.85996: Set connection var ansible_shell_type to sh 30583 1726853771.86003: Set connection var ansible_pipelining to False 30583 1726853771.86019: variable 'ansible_shell_executable' from source: unknown 30583 1726853771.86021: variable 'ansible_connection' from source: unknown 30583 1726853771.86024: variable 'ansible_module_compression' from source: unknown 30583 1726853771.86027: variable 'ansible_shell_type' from source: unknown 30583 1726853771.86029: variable 'ansible_shell_executable' from source: unknown 30583 1726853771.86031: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.86033: variable 'ansible_pipelining' from source: unknown 30583 1726853771.86037: variable 'ansible_timeout' from source: unknown 30583 1726853771.86041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.86102: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853771.86110: variable 'omit' from source: magic vars 30583 1726853771.86112: starting attempt loop 30583 1726853771.86115: running the handler 30583 1726853771.86131: variable 'lsr_setup' from source: include params 30583 1726853771.86184: variable 'lsr_setup' from source: include params 30583 1726853771.86218: handler run complete 30583 1726853771.86229: attempt loop complete, returning result 30583 1726853771.86241: variable 'item' from source: unknown 30583 1726853771.86290: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": 
"lsr_setup", "lsr_setup": [ "tasks/create_bridge_profile.yml", "tasks/activate_profile.yml", "tasks/remove+down_profile.yml" ] } 30583 1726853771.86367: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.86374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.86377: variable 'omit' from source: magic vars 30583 1726853771.86474: variable 'ansible_distribution_major_version' from source: facts 30583 1726853771.86478: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853771.86482: variable 'omit' from source: magic vars 30583 1726853771.86494: variable 'omit' from source: magic vars 30583 1726853771.86520: variable 'item' from source: unknown 30583 1726853771.86564: variable 'item' from source: unknown 30583 1726853771.86576: variable 'omit' from source: magic vars 30583 1726853771.86589: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853771.86598: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853771.86601: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853771.86612: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853771.86615: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.86617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.86659: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853771.86666: Set connection var ansible_timeout to 10 30583 1726853771.86669: Set connection var ansible_connection to ssh 30583 1726853771.86675: Set connection var ansible_shell_executable to /bin/sh 30583 1726853771.86677: Set 
connection var ansible_shell_type to sh 30583 1726853771.86684: Set connection var ansible_pipelining to False 30583 1726853771.86699: variable 'ansible_shell_executable' from source: unknown 30583 1726853771.86703: variable 'ansible_connection' from source: unknown 30583 1726853771.86706: variable 'ansible_module_compression' from source: unknown 30583 1726853771.86708: variable 'ansible_shell_type' from source: unknown 30583 1726853771.86712: variable 'ansible_shell_executable' from source: unknown 30583 1726853771.86714: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.86716: variable 'ansible_pipelining' from source: unknown 30583 1726853771.86718: variable 'ansible_timeout' from source: unknown 30583 1726853771.86720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.86779: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853771.86785: variable 'omit' from source: magic vars 30583 1726853771.86787: starting attempt loop 30583 1726853771.86790: running the handler 30583 1726853771.86803: variable 'lsr_test' from source: include params 30583 1726853771.86849: variable 'lsr_test' from source: include params 30583 1726853771.86864: handler run complete 30583 1726853771.86876: attempt loop complete, returning result 30583 1726853771.86887: variable 'item' from source: unknown 30583 1726853771.86928: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/remove+down_profile.yml" ] } 30583 1726853771.87002: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.87005: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 30583 1726853771.87008: variable 'omit' from source: magic vars 30583 1726853771.87109: variable 'ansible_distribution_major_version' from source: facts 30583 1726853771.87112: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853771.87119: variable 'omit' from source: magic vars 30583 1726853771.87130: variable 'omit' from source: magic vars 30583 1726853771.87155: variable 'item' from source: unknown 30583 1726853771.87203: variable 'item' from source: unknown 30583 1726853771.87213: variable 'omit' from source: magic vars 30583 1726853771.87226: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853771.87232: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853771.87240: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853771.87247: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853771.87250: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.87252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.87300: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853771.87303: Set connection var ansible_timeout to 10 30583 1726853771.87305: Set connection var ansible_connection to ssh 30583 1726853771.87310: Set connection var ansible_shell_executable to /bin/sh 30583 1726853771.87313: Set connection var ansible_shell_type to sh 30583 1726853771.87320: Set connection var ansible_pipelining to False 30583 1726853771.87335: variable 'ansible_shell_executable' from source: unknown 30583 1726853771.87337: variable 'ansible_connection' from source: unknown 30583 1726853771.87340: 
variable 'ansible_module_compression' from source: unknown 30583 1726853771.87348: variable 'ansible_shell_type' from source: unknown 30583 1726853771.87351: variable 'ansible_shell_executable' from source: unknown 30583 1726853771.87353: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.87355: variable 'ansible_pipelining' from source: unknown 30583 1726853771.87357: variable 'ansible_timeout' from source: unknown 30583 1726853771.87359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.87415: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853771.87420: variable 'omit' from source: magic vars 30583 1726853771.87424: starting attempt loop 30583 1726853771.87426: running the handler 30583 1726853771.87440: variable 'lsr_assert' from source: include params 30583 1726853771.87489: variable 'lsr_assert' from source: include params 30583 1726853771.87502: handler run complete 30583 1726853771.87511: attempt loop complete, returning result 30583 1726853771.87521: variable 'item' from source: unknown 30583 1726853771.87567: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_profile_absent.yml", "tasks/get_NetworkManager_NVR.yml" ] } 30583 1726853771.87637: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.87640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.87643: variable 'omit' from source: magic vars 30583 1726853771.87785: variable 'ansible_distribution_major_version' from source: facts 30583 1726853771.87793: Evaluated conditional 
(ansible_distribution_major_version != '6'): True 30583 1726853771.87795: variable 'omit' from source: magic vars 30583 1726853771.87804: variable 'omit' from source: magic vars 30583 1726853771.87830: variable 'item' from source: unknown 30583 1726853771.87875: variable 'item' from source: unknown 30583 1726853771.87890: variable 'omit' from source: magic vars 30583 1726853771.87901: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853771.87907: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853771.87913: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853771.87921: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853771.87923: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.87926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.87971: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853771.87975: Set connection var ansible_timeout to 10 30583 1726853771.87977: Set connection var ansible_connection to ssh 30583 1726853771.87982: Set connection var ansible_shell_executable to /bin/sh 30583 1726853771.87984: Set connection var ansible_shell_type to sh 30583 1726853771.87993: Set connection var ansible_pipelining to False 30583 1726853771.88009: variable 'ansible_shell_executable' from source: unknown 30583 1726853771.88012: variable 'ansible_connection' from source: unknown 30583 1726853771.88015: variable 'ansible_module_compression' from source: unknown 30583 1726853771.88017: variable 'ansible_shell_type' from source: unknown 30583 1726853771.88019: variable 'ansible_shell_executable' from source: unknown 30583 
1726853771.88021: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.88025: variable 'ansible_pipelining' from source: unknown 30583 1726853771.88027: variable 'ansible_timeout' from source: unknown 30583 1726853771.88031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.88091: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853771.88098: variable 'omit' from source: magic vars 30583 1726853771.88102: starting attempt loop 30583 1726853771.88106: running the handler 30583 1726853771.88121: variable 'lsr_assert_when' from source: include params 30583 1726853771.88162: variable 'lsr_assert_when' from source: include params 30583 1726853771.88223: variable 'network_provider' from source: set_fact 30583 1726853771.88249: handler run complete 30583 1726853771.88259: attempt loop complete, returning result 30583 1726853771.88273: variable 'item' from source: unknown 30583 1726853771.88315: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": [ { "condition": true, "what": "tasks/assert_device_absent.yml" } ] } 30583 1726853771.88389: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.88392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.88395: variable 'omit' from source: magic vars 30583 1726853771.88493: variable 'ansible_distribution_major_version' from source: facts 30583 1726853771.88496: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853771.88500: variable 'omit' from source: magic vars 30583 1726853771.88513: 
variable 'omit' from source: magic vars 30583 1726853771.88539: variable 'item' from source: unknown 30583 1726853771.88584: variable 'item' from source: unknown 30583 1726853771.88595: variable 'omit' from source: magic vars 30583 1726853771.88608: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853771.88617: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853771.88620: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853771.88629: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853771.88632: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.88635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.88682: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853771.88685: Set connection var ansible_timeout to 10 30583 1726853771.88688: Set connection var ansible_connection to ssh 30583 1726853771.88693: Set connection var ansible_shell_executable to /bin/sh 30583 1726853771.88695: Set connection var ansible_shell_type to sh 30583 1726853771.88702: Set connection var ansible_pipelining to False 30583 1726853771.88716: variable 'ansible_shell_executable' from source: unknown 30583 1726853771.88719: variable 'ansible_connection' from source: unknown 30583 1726853771.88723: variable 'ansible_module_compression' from source: unknown 30583 1726853771.88725: variable 'ansible_shell_type' from source: unknown 30583 1726853771.88727: variable 'ansible_shell_executable' from source: unknown 30583 1726853771.88729: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.88731: variable 'ansible_pipelining' from source: 
unknown 30583 1726853771.88733: variable 'ansible_timeout' from source: unknown 30583 1726853771.88738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.88798: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853771.88804: variable 'omit' from source: magic vars 30583 1726853771.88806: starting attempt loop 30583 1726853771.88809: running the handler 30583 1726853771.88822: variable 'lsr_fail_debug' from source: play vars 30583 1726853771.88874: variable 'lsr_fail_debug' from source: play vars 30583 1726853771.88883: handler run complete 30583 1726853771.88893: attempt loop complete, returning result 30583 1726853771.88905: variable 'item' from source: unknown 30583 1726853771.88945: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 30583 1726853771.89024: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.89027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.89029: variable 'omit' from source: magic vars 30583 1726853771.89123: variable 'ansible_distribution_major_version' from source: facts 30583 1726853771.89129: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853771.89134: variable 'omit' from source: magic vars 30583 1726853771.89143: variable 'omit' from source: magic vars 30583 1726853771.89176: variable 'item' from source: unknown 30583 1726853771.89216: variable 'item' from source: unknown 30583 1726853771.89227: variable 'omit' from source: magic vars 30583 1726853771.89243: Loading Connection 
'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853771.89250: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853771.89253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853771.89260: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853771.89265: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.89267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.89314: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853771.89317: Set connection var ansible_timeout to 10 30583 1726853771.89320: Set connection var ansible_connection to ssh 30583 1726853771.89325: Set connection var ansible_shell_executable to /bin/sh 30583 1726853771.89328: Set connection var ansible_shell_type to sh 30583 1726853771.89334: Set connection var ansible_pipelining to False 30583 1726853771.89351: variable 'ansible_shell_executable' from source: unknown 30583 1726853771.89354: variable 'ansible_connection' from source: unknown 30583 1726853771.89356: variable 'ansible_module_compression' from source: unknown 30583 1726853771.89358: variable 'ansible_shell_type' from source: unknown 30583 1726853771.89360: variable 'ansible_shell_executable' from source: unknown 30583 1726853771.89362: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.89367: variable 'ansible_pipelining' from source: unknown 30583 1726853771.89369: variable 'ansible_timeout' from source: unknown 30583 1726853771.89374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.89430: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853771.89437: variable 'omit' from source: magic vars 30583 1726853771.89440: starting attempt loop 30583 1726853771.89442: running the handler 30583 1726853771.89457: variable 'lsr_cleanup' from source: include params 30583 1726853771.89501: variable 'lsr_cleanup' from source: include params 30583 1726853771.89516: handler run complete 30583 1726853771.89527: attempt loop complete, returning result 30583 1726853771.89537: variable 'item' from source: unknown 30583 1726853771.89583: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml", "tasks/check_network_dns.yml" ] } 30583 1726853771.89655: dumping result to json 30583 1726853771.89658: done dumping result, returning 30583 1726853771.89660: done running TaskExecutor() for managed_node2/TASK: Show item [02083763-bbaf-05ea-abc5-0000000020ae] 30583 1726853771.89662: sending task result for task 02083763-bbaf-05ea-abc5-0000000020ae 30583 1726853771.89704: done sending task result for task 02083763-bbaf-05ea-abc5-0000000020ae 30583 1726853771.89707: WORKER PROCESS EXITING 30583 1726853771.89756: no more pending results, returning what we have 30583 1726853771.89759: results queue empty 30583 1726853771.89760: checking for any_errors_fatal 30583 1726853771.89768: done checking for any_errors_fatal 30583 1726853771.89769: checking for max_fail_percentage 30583 1726853771.89772: done checking for max_fail_percentage 30583 1726853771.89773: checking to see if all hosts have failed and the running result is not ok 30583 1726853771.89774: done checking to see if all hosts have failed 30583 1726853771.89774: getting the 
remaining hosts for this loop 30583 1726853771.89776: done getting the remaining hosts for this loop 30583 1726853771.89780: getting the next task for host managed_node2 30583 1726853771.89787: done getting next task for host managed_node2 30583 1726853771.89789: ^ task is: TASK: Include the task 'show_interfaces.yml' 30583 1726853771.89792: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853771.89796: getting variables 30583 1726853771.89798: in VariableManager get_vars() 30583 1726853771.89844: Calling all_inventory to load vars for managed_node2 30583 1726853771.89847: Calling groups_inventory to load vars for managed_node2 30583 1726853771.89851: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853771.89861: Calling all_plugins_play to load vars for managed_node2 30583 1726853771.89864: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853771.89867: Calling groups_plugins_play to load vars for managed_node2 30583 1726853771.90846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853771.91707: done with get_vars() 30583 1726853771.91725: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 13:36:11 -0400 (0:00:00.079) 
0:01:47.255 ****** 30583 1726853771.91795: entering _queue_task() for managed_node2/include_tasks 30583 1726853771.92057: worker is 1 (out of 1 available) 30583 1726853771.92069: exiting _queue_task() for managed_node2/include_tasks 30583 1726853771.92085: done queuing things up, now waiting for results queue to drain 30583 1726853771.92086: waiting for pending results... 30583 1726853771.92283: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 30583 1726853771.92362: in run() - task 02083763-bbaf-05ea-abc5-0000000020af 30583 1726853771.92379: variable 'ansible_search_path' from source: unknown 30583 1726853771.92382: variable 'ansible_search_path' from source: unknown 30583 1726853771.92410: calling self._execute() 30583 1726853771.92491: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.92495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.92505: variable 'omit' from source: magic vars 30583 1726853771.92796: variable 'ansible_distribution_major_version' from source: facts 30583 1726853771.92806: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853771.92811: _execute() done 30583 1726853771.92814: dumping result to json 30583 1726853771.92817: done dumping result, returning 30583 1726853771.92823: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [02083763-bbaf-05ea-abc5-0000000020af] 30583 1726853771.92827: sending task result for task 02083763-bbaf-05ea-abc5-0000000020af 30583 1726853771.92913: done sending task result for task 02083763-bbaf-05ea-abc5-0000000020af 30583 1726853771.92915: WORKER PROCESS EXITING 30583 1726853771.92942: no more pending results, returning what we have 30583 1726853771.92947: in VariableManager get_vars() 30583 1726853771.92998: Calling all_inventory to load vars for managed_node2 30583 1726853771.93001: Calling groups_inventory to load vars for 
managed_node2 30583 1726853771.93005: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853771.93017: Calling all_plugins_play to load vars for managed_node2 30583 1726853771.93020: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853771.93023: Calling groups_plugins_play to load vars for managed_node2 30583 1726853771.93868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853771.94736: done with get_vars() 30583 1726853771.94750: variable 'ansible_search_path' from source: unknown 30583 1726853771.94751: variable 'ansible_search_path' from source: unknown 30583 1726853771.94782: we have included files to process 30583 1726853771.94783: generating all_blocks data 30583 1726853771.94784: done generating all_blocks data 30583 1726853771.94787: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30583 1726853771.94788: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30583 1726853771.94789: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30583 1726853771.94863: in VariableManager get_vars() 30583 1726853771.94879: done with get_vars() 30583 1726853771.94955: done processing included file 30583 1726853771.94956: iterating over new_blocks loaded from include file 30583 1726853771.94957: in VariableManager get_vars() 30583 1726853771.94970: done with get_vars() 30583 1726853771.94973: filtering new block on tags 30583 1726853771.94993: done filtering new block on tags 30583 1726853771.94994: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 30583 
1726853771.94998: extending task lists for all hosts with included blocks 30583 1726853771.95260: done extending task lists 30583 1726853771.95261: done processing included files 30583 1726853771.95262: results queue empty 30583 1726853771.95262: checking for any_errors_fatal 30583 1726853771.95266: done checking for any_errors_fatal 30583 1726853771.95267: checking for max_fail_percentage 30583 1726853771.95268: done checking for max_fail_percentage 30583 1726853771.95268: checking to see if all hosts have failed and the running result is not ok 30583 1726853771.95269: done checking to see if all hosts have failed 30583 1726853771.95269: getting the remaining hosts for this loop 30583 1726853771.95270: done getting the remaining hosts for this loop 30583 1726853771.95273: getting the next task for host managed_node2 30583 1726853771.95276: done getting next task for host managed_node2 30583 1726853771.95278: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30583 1726853771.95280: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853771.95282: getting variables 30583 1726853771.95282: in VariableManager get_vars() 30583 1726853771.95290: Calling all_inventory to load vars for managed_node2 30583 1726853771.95291: Calling groups_inventory to load vars for managed_node2 30583 1726853771.95293: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853771.95296: Calling all_plugins_play to load vars for managed_node2 30583 1726853771.95298: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853771.95299: Calling groups_plugins_play to load vars for managed_node2 30583 1726853771.96037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853771.96884: done with get_vars() 30583 1726853771.96900: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 13:36:11 -0400 (0:00:00.051) 0:01:47.306 ****** 30583 1726853771.96951: entering _queue_task() for managed_node2/include_tasks 30583 1726853771.97224: worker is 1 (out of 1 available) 30583 1726853771.97237: exiting _queue_task() for managed_node2/include_tasks 30583 1726853771.97250: done queuing things up, now waiting for results queue to drain 30583 1726853771.97251: waiting for pending results... 
30583 1726853771.97442: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 30583 1726853771.97523: in run() - task 02083763-bbaf-05ea-abc5-0000000020d6 30583 1726853771.97534: variable 'ansible_search_path' from source: unknown 30583 1726853771.97539: variable 'ansible_search_path' from source: unknown 30583 1726853771.97568: calling self._execute() 30583 1726853771.97649: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853771.97652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853771.97663: variable 'omit' from source: magic vars 30583 1726853771.97946: variable 'ansible_distribution_major_version' from source: facts 30583 1726853771.97956: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853771.97962: _execute() done 30583 1726853771.97965: dumping result to json 30583 1726853771.97968: done dumping result, returning 30583 1726853771.97976: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [02083763-bbaf-05ea-abc5-0000000020d6] 30583 1726853771.97980: sending task result for task 02083763-bbaf-05ea-abc5-0000000020d6 30583 1726853771.98068: done sending task result for task 02083763-bbaf-05ea-abc5-0000000020d6 30583 1726853771.98073: WORKER PROCESS EXITING 30583 1726853771.98103: no more pending results, returning what we have 30583 1726853771.98107: in VariableManager get_vars() 30583 1726853771.98155: Calling all_inventory to load vars for managed_node2 30583 1726853771.98160: Calling groups_inventory to load vars for managed_node2 30583 1726853771.98164: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853771.98178: Calling all_plugins_play to load vars for managed_node2 30583 1726853771.98181: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853771.98185: Calling groups_plugins_play to load vars for managed_node2 30583 
1726853771.99023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853772.04551: done with get_vars() 30583 1726853772.04575: variable 'ansible_search_path' from source: unknown 30583 1726853772.04577: variable 'ansible_search_path' from source: unknown 30583 1726853772.04604: we have included files to process 30583 1726853772.04605: generating all_blocks data 30583 1726853772.04606: done generating all_blocks data 30583 1726853772.04607: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30583 1726853772.04607: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30583 1726853772.04609: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30583 1726853772.04777: done processing included file 30583 1726853772.04779: iterating over new_blocks loaded from include file 30583 1726853772.04780: in VariableManager get_vars() 30583 1726853772.04791: done with get_vars() 30583 1726853772.04792: filtering new block on tags 30583 1726853772.04815: done filtering new block on tags 30583 1726853772.04817: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 30583 1726853772.04820: extending task lists for all hosts with included blocks 30583 1726853772.04912: done extending task lists 30583 1726853772.04914: done processing included files 30583 1726853772.04914: results queue empty 30583 1726853772.04915: checking for any_errors_fatal 30583 1726853772.04917: done checking for any_errors_fatal 30583 1726853772.04918: checking for max_fail_percentage 30583 1726853772.04919: done 
checking for max_fail_percentage 30583 1726853772.04919: checking to see if all hosts have failed and the running result is not ok 30583 1726853772.04920: done checking to see if all hosts have failed 30583 1726853772.04920: getting the remaining hosts for this loop 30583 1726853772.04921: done getting the remaining hosts for this loop 30583 1726853772.04923: getting the next task for host managed_node2 30583 1726853772.04925: done getting next task for host managed_node2 30583 1726853772.04927: ^ task is: TASK: Gather current interface info 30583 1726853772.04929: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853772.04930: getting variables 30583 1726853772.04931: in VariableManager get_vars() 30583 1726853772.04938: Calling all_inventory to load vars for managed_node2 30583 1726853772.04939: Calling groups_inventory to load vars for managed_node2 30583 1726853772.04941: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853772.04945: Calling all_plugins_play to load vars for managed_node2 30583 1726853772.04946: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853772.04948: Calling groups_plugins_play to load vars for managed_node2 30583 1726853772.05624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853772.06472: done with get_vars() 30583 1726853772.06487: done getting variables 30583 1726853772.06513: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 13:36:12 -0400 (0:00:00.095) 0:01:47.402 ****** 30583 1726853772.06533: entering _queue_task() for managed_node2/command 30583 1726853772.06817: worker is 1 (out of 1 available) 30583 1726853772.06831: exiting _queue_task() for managed_node2/command 30583 1726853772.06843: done queuing things up, now waiting for results queue to drain 30583 1726853772.06845: waiting for pending results... 
30583 1726853772.07032: running TaskExecutor() for managed_node2/TASK: Gather current interface info 30583 1726853772.07126: in run() - task 02083763-bbaf-05ea-abc5-000000002111 30583 1726853772.07136: variable 'ansible_search_path' from source: unknown 30583 1726853772.07140: variable 'ansible_search_path' from source: unknown 30583 1726853772.07168: calling self._execute() 30583 1726853772.07249: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853772.07255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853772.07265: variable 'omit' from source: magic vars 30583 1726853772.07557: variable 'ansible_distribution_major_version' from source: facts 30583 1726853772.07568: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853772.07574: variable 'omit' from source: magic vars 30583 1726853772.07610: variable 'omit' from source: magic vars 30583 1726853772.07639: variable 'omit' from source: magic vars 30583 1726853772.07670: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853772.07697: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853772.07715: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853772.07728: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853772.07737: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853772.07765: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853772.07768: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853772.07772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 
1726853772.07843: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853772.07851: Set connection var ansible_timeout to 10 30583 1726853772.07854: Set connection var ansible_connection to ssh 30583 1726853772.07856: Set connection var ansible_shell_executable to /bin/sh 30583 1726853772.07861: Set connection var ansible_shell_type to sh 30583 1726853772.07869: Set connection var ansible_pipelining to False 30583 1726853772.07888: variable 'ansible_shell_executable' from source: unknown 30583 1726853772.07891: variable 'ansible_connection' from source: unknown 30583 1726853772.07894: variable 'ansible_module_compression' from source: unknown 30583 1726853772.07896: variable 'ansible_shell_type' from source: unknown 30583 1726853772.07898: variable 'ansible_shell_executable' from source: unknown 30583 1726853772.07900: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853772.07902: variable 'ansible_pipelining' from source: unknown 30583 1726853772.07905: variable 'ansible_timeout' from source: unknown 30583 1726853772.07909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853772.08011: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853772.08020: variable 'omit' from source: magic vars 30583 1726853772.08025: starting attempt loop 30583 1726853772.08028: running the handler 30583 1726853772.08040: _low_level_execute_command(): starting 30583 1726853772.08047: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853772.08582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30583 1726853772.08586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853772.08591: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853772.08593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853772.08647: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853772.08650: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853772.08653: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853772.08735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853772.10468: stdout chunk (state=3): >>>/root <<< 30583 1726853772.10567: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853772.10608: stderr chunk (state=3): >>><<< 30583 1726853772.10611: stdout chunk (state=3): >>><<< 30583 1726853772.10632: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853772.10649: _low_level_execute_command(): starting 30583 1726853772.10654: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853772.1063483-35620-34047594108919 `" && echo ansible-tmp-1726853772.1063483-35620-34047594108919="` echo /root/.ansible/tmp/ansible-tmp-1726853772.1063483-35620-34047594108919 `" ) && sleep 0' 30583 1726853772.11109: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853772.11112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853772.11114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853772.11127: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853772.11129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853772.11176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853772.11181: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853772.11186: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853772.11257: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853772.13213: stdout chunk (state=3): >>>ansible-tmp-1726853772.1063483-35620-34047594108919=/root/.ansible/tmp/ansible-tmp-1726853772.1063483-35620-34047594108919 <<< 30583 1726853772.13319: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853772.13347: stderr chunk (state=3): >>><<< 30583 1726853772.13350: stdout chunk (state=3): >>><<< 30583 1726853772.13365: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853772.1063483-35620-34047594108919=/root/.ansible/tmp/ansible-tmp-1726853772.1063483-35620-34047594108919 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853772.13396: variable 'ansible_module_compression' from source: unknown 30583 1726853772.13442: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30583 1726853772.13476: variable 'ansible_facts' from source: unknown 30583 1726853772.13534: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853772.1063483-35620-34047594108919/AnsiballZ_command.py 30583 1726853772.13633: Sending initial data 30583 1726853772.13637: Sent initial data (155 bytes) 30583 1726853772.14086: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853772.14090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853772.14093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853772.14095: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853772.14097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853772.14100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853772.14149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853772.14154: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853772.14157: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853772.14224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853772.15896: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853772.15962: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853772.16034: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpr41ykpax /root/.ansible/tmp/ansible-tmp-1726853772.1063483-35620-34047594108919/AnsiballZ_command.py <<< 30583 1726853772.16037: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853772.1063483-35620-34047594108919/AnsiballZ_command.py" <<< 30583 1726853772.16104: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpr41ykpax" to remote "/root/.ansible/tmp/ansible-tmp-1726853772.1063483-35620-34047594108919/AnsiballZ_command.py" <<< 30583 1726853772.16106: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853772.1063483-35620-34047594108919/AnsiballZ_command.py" <<< 30583 1726853772.16787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853772.16809: stderr chunk (state=3): >>><<< 30583 1726853772.16812: stdout chunk (state=3): >>><<< 30583 1726853772.16828: done transferring module to remote 30583 1726853772.16838: _low_level_execute_command(): starting 30583 1726853772.16848: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853772.1063483-35620-34047594108919/ /root/.ansible/tmp/ansible-tmp-1726853772.1063483-35620-34047594108919/AnsiballZ_command.py && sleep 0' 30583 1726853772.17294: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853772.17298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853772.17300: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853772.17302: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853772.17308: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853772.17310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853772.17356: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853772.17363: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853772.17431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853772.19319: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853772.19342: stderr chunk (state=3): >>><<< 30583 1726853772.19347: stdout chunk (state=3): >>><<< 30583 1726853772.19365: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853772.19369: _low_level_execute_command(): starting 30583 1726853772.19374: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853772.1063483-35620-34047594108919/AnsiballZ_command.py && sleep 0' 30583 1726853772.19820: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853772.19823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853772.19826: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853772.19828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853772.19877: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853772.19884: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853772.19900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853772.19969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853772.36180: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:36:12.357177", "end": "2024-09-20 13:36:12.360672", "delta": "0:00:00.003495", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30583 1726853772.37914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853772.37918: stdout chunk (state=3): >>><<< 30583 1726853772.37920: stderr chunk (state=3): >>><<< 30583 1726853772.38077: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:36:12.357177", "end": "2024-09-20 13:36:12.360672", "delta": "0:00:00.003495", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853772.38082: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853772.1063483-35620-34047594108919/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853772.38085: _low_level_execute_command(): starting 30583 1726853772.38088: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853772.1063483-35620-34047594108919/ > /dev/null 2>&1 && sleep 0' 30583 1726853772.38732: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853772.38748: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853772.38776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853772.38889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853772.38907: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853772.38920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853772.39092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853772.41037: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853772.41052: stdout chunk (state=3): >>><<< 30583 1726853772.41066: stderr chunk (state=3): >>><<< 30583 1726853772.41089: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 30583 1726853772.41102: handler run complete 30583 1726853772.41130: Evaluated conditional (False): False 30583 1726853772.41151: attempt loop complete, returning result 30583 1726853772.41163: _execute() done 30583 1726853772.41262: dumping result to json 30583 1726853772.41265: done dumping result, returning 30583 1726853772.41268: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [02083763-bbaf-05ea-abc5-000000002111] 30583 1726853772.41270: sending task result for task 02083763-bbaf-05ea-abc5-000000002111 30583 1726853772.41341: done sending task result for task 02083763-bbaf-05ea-abc5-000000002111 30583 1726853772.41344: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003495", "end": "2024-09-20 13:36:12.360672", "rc": 0, "start": "2024-09-20 13:36:12.357177" } STDOUT: bonding_masters eth0 lo 30583 1726853772.41427: no more pending results, returning what we have 30583 1726853772.41431: results queue empty 30583 1726853772.41432: checking for any_errors_fatal 30583 1726853772.41433: done checking for any_errors_fatal 30583 1726853772.41435: checking for max_fail_percentage 30583 1726853772.41437: done checking for max_fail_percentage 30583 1726853772.41438: checking to see if all hosts have failed and the running result is not ok 30583 1726853772.41439: done checking to see if all hosts have failed 30583 1726853772.41440: getting the remaining hosts for this loop 30583 1726853772.41442: done getting the remaining hosts for this loop 30583 1726853772.41446: getting the next task for host managed_node2 30583 1726853772.41455: done getting next task for host managed_node2 30583 1726853772.41460: ^ task is: TASK: Set current_interfaces 30583 1726853772.41467: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853772.41474: getting variables 30583 1726853772.41476: in VariableManager get_vars() 30583 1726853772.41521: Calling all_inventory to load vars for managed_node2 30583 1726853772.41524: Calling groups_inventory to load vars for managed_node2 30583 1726853772.41527: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853772.41538: Calling all_plugins_play to load vars for managed_node2 30583 1726853772.41542: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853772.41545: Calling groups_plugins_play to load vars for managed_node2 30583 1726853772.43517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853772.45136: done with get_vars() 30583 1726853772.45167: done getting variables 30583 1726853772.45227: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:36:12 -0400 (0:00:00.387) 0:01:47.789 ****** 30583 1726853772.45264: entering _queue_task() for managed_node2/set_fact 30583 1726853772.45796: worker is 1 (out of 1 available) 30583 1726853772.45805: exiting _queue_task() for managed_node2/set_fact 30583 1726853772.45815: done queuing things up, now waiting for results queue to drain 30583 1726853772.45816: waiting for pending results... 30583 1726853772.46056: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 30583 1726853772.46123: in run() - task 02083763-bbaf-05ea-abc5-000000002112 30583 1726853772.46141: variable 'ansible_search_path' from source: unknown 30583 1726853772.46155: variable 'ansible_search_path' from source: unknown 30583 1726853772.46264: calling self._execute() 30583 1726853772.46308: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853772.46318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853772.46332: variable 'omit' from source: magic vars 30583 1726853772.46735: variable 'ansible_distribution_major_version' from source: facts 30583 1726853772.46753: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853772.46767: variable 'omit' from source: magic vars 30583 1726853772.46830: variable 'omit' from source: magic vars 30583 1726853772.46950: variable '_current_interfaces' from source: set_fact 30583 1726853772.47029: variable 'omit' from source: magic vars 30583 1726853772.47082: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 
1726853772.47239: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853772.47243: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853772.47245: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853772.47247: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853772.47249: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853772.47252: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853772.47254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853772.47349: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853772.47372: Set connection var ansible_timeout to 10 30583 1726853772.47382: Set connection var ansible_connection to ssh 30583 1726853772.47394: Set connection var ansible_shell_executable to /bin/sh 30583 1726853772.47456: Set connection var ansible_shell_type to sh 30583 1726853772.47462: Set connection var ansible_pipelining to False 30583 1726853772.47465: variable 'ansible_shell_executable' from source: unknown 30583 1726853772.47467: variable 'ansible_connection' from source: unknown 30583 1726853772.47469: variable 'ansible_module_compression' from source: unknown 30583 1726853772.47476: variable 'ansible_shell_type' from source: unknown 30583 1726853772.47479: variable 'ansible_shell_executable' from source: unknown 30583 1726853772.47481: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853772.47483: variable 'ansible_pipelining' from source: unknown 30583 1726853772.47490: variable 'ansible_timeout' from source: unknown 30583 1726853772.47498: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed_node2' 30583 1726853772.47648: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853772.47674: variable 'omit' from source: magic vars 30583 1726853772.47685: starting attempt loop 30583 1726853772.47787: running the handler 30583 1726853772.47790: handler run complete 30583 1726853772.47792: attempt loop complete, returning result 30583 1726853772.47794: _execute() done 30583 1726853772.47798: dumping result to json 30583 1726853772.47800: done dumping result, returning 30583 1726853772.47802: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [02083763-bbaf-05ea-abc5-000000002112] 30583 1726853772.47804: sending task result for task 02083763-bbaf-05ea-abc5-000000002112 30583 1726853772.47874: done sending task result for task 02083763-bbaf-05ea-abc5-000000002112 30583 1726853772.47877: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 30583 1726853772.47940: no more pending results, returning what we have 30583 1726853772.47944: results queue empty 30583 1726853772.47945: checking for any_errors_fatal 30583 1726853772.47960: done checking for any_errors_fatal 30583 1726853772.47961: checking for max_fail_percentage 30583 1726853772.47964: done checking for max_fail_percentage 30583 1726853772.47965: checking to see if all hosts have failed and the running result is not ok 30583 1726853772.47966: done checking to see if all hosts have failed 30583 1726853772.47967: getting the remaining hosts for this loop 30583 1726853772.47969: done getting the remaining hosts for this loop 30583 1726853772.47978: getting the next task for host managed_node2 
30583 1726853772.47991: done getting next task for host managed_node2 30583 1726853772.47994: ^ task is: TASK: Show current_interfaces 30583 1726853772.47998: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853772.48004: getting variables 30583 1726853772.48005: in VariableManager get_vars() 30583 1726853772.48049: Calling all_inventory to load vars for managed_node2 30583 1726853772.48052: Calling groups_inventory to load vars for managed_node2 30583 1726853772.48056: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853772.48172: Calling all_plugins_play to load vars for managed_node2 30583 1726853772.48179: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853772.48183: Calling groups_plugins_play to load vars for managed_node2 30583 1726853772.49811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853772.51505: done with get_vars() 30583 1726853772.51527: done getting variables 30583 1726853772.51595: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 13:36:12 -0400 (0:00:00.063) 0:01:47.853 ****** 30583 1726853772.51629: entering _queue_task() for managed_node2/debug 30583 1726853772.51982: worker is 1 (out of 1 available) 30583 1726853772.52106: exiting _queue_task() for managed_node2/debug 30583 1726853772.52120: done queuing things up, now waiting for results queue to drain 30583 1726853772.52121: waiting for pending results... 
30583 1726853772.52493: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 30583 1726853772.52498: in run() - task 02083763-bbaf-05ea-abc5-0000000020d7 30583 1726853772.52501: variable 'ansible_search_path' from source: unknown 30583 1726853772.52504: variable 'ansible_search_path' from source: unknown 30583 1726853772.52538: calling self._execute() 30583 1726853772.52648: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853772.52660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853772.52680: variable 'omit' from source: magic vars 30583 1726853772.53107: variable 'ansible_distribution_major_version' from source: facts 30583 1726853772.53129: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853772.53144: variable 'omit' from source: magic vars 30583 1726853772.53190: variable 'omit' from source: magic vars 30583 1726853772.53297: variable 'current_interfaces' from source: set_fact 30583 1726853772.53570: variable 'omit' from source: magic vars 30583 1726853772.53575: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853772.53578: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853772.53580: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853772.53583: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853772.53654: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853772.53695: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853772.53708: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853772.53716: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853772.53921: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853772.53925: Set connection var ansible_timeout to 10 30583 1726853772.53927: Set connection var ansible_connection to ssh 30583 1726853772.53929: Set connection var ansible_shell_executable to /bin/sh 30583 1726853772.53931: Set connection var ansible_shell_type to sh 30583 1726853772.53933: Set connection var ansible_pipelining to False 30583 1726853772.53935: variable 'ansible_shell_executable' from source: unknown 30583 1726853772.53937: variable 'ansible_connection' from source: unknown 30583 1726853772.53939: variable 'ansible_module_compression' from source: unknown 30583 1726853772.53941: variable 'ansible_shell_type' from source: unknown 30583 1726853772.53943: variable 'ansible_shell_executable' from source: unknown 30583 1726853772.53944: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853772.53946: variable 'ansible_pipelining' from source: unknown 30583 1726853772.53948: variable 'ansible_timeout' from source: unknown 30583 1726853772.53950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853772.54088: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853772.54109: variable 'omit' from source: magic vars 30583 1726853772.54121: starting attempt loop 30583 1726853772.54128: running the handler 30583 1726853772.54188: handler run complete 30583 1726853772.54207: attempt loop complete, returning result 30583 1726853772.54214: _execute() done 30583 1726853772.54221: dumping result to json 30583 1726853772.54229: done dumping result, returning 30583 1726853772.54250: done 
running TaskExecutor() for managed_node2/TASK: Show current_interfaces [02083763-bbaf-05ea-abc5-0000000020d7] 30583 1726853772.54260: sending task result for task 02083763-bbaf-05ea-abc5-0000000020d7 30583 1726853772.54426: done sending task result for task 02083763-bbaf-05ea-abc5-0000000020d7 30583 1726853772.54430: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 30583 1726853772.54486: no more pending results, returning what we have 30583 1726853772.54489: results queue empty 30583 1726853772.54490: checking for any_errors_fatal 30583 1726853772.54496: done checking for any_errors_fatal 30583 1726853772.54496: checking for max_fail_percentage 30583 1726853772.54498: done checking for max_fail_percentage 30583 1726853772.54499: checking to see if all hosts have failed and the running result is not ok 30583 1726853772.54500: done checking to see if all hosts have failed 30583 1726853772.54500: getting the remaining hosts for this loop 30583 1726853772.54502: done getting the remaining hosts for this loop 30583 1726853772.54505: getting the next task for host managed_node2 30583 1726853772.54515: done getting next task for host managed_node2 30583 1726853772.54518: ^ task is: TASK: Setup 30583 1726853772.54521: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853772.54526: getting variables 30583 1726853772.54528: in VariableManager get_vars() 30583 1726853772.54574: Calling all_inventory to load vars for managed_node2 30583 1726853772.54577: Calling groups_inventory to load vars for managed_node2 30583 1726853772.54582: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853772.54592: Calling all_plugins_play to load vars for managed_node2 30583 1726853772.54594: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853772.54597: Calling groups_plugins_play to load vars for managed_node2 30583 1726853772.56626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853772.57488: done with get_vars() 30583 1726853772.57504: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 13:36:12 -0400 (0:00:00.059) 0:01:47.912 ****** 30583 1726853772.57575: entering _queue_task() for managed_node2/include_tasks 30583 1726853772.57826: worker is 1 (out of 1 available) 30583 1726853772.57839: exiting _queue_task() for managed_node2/include_tasks 30583 1726853772.57852: done queuing things up, now waiting for results queue to drain 30583 1726853772.57853: waiting for pending results... 
30583 1726853772.58188: running TaskExecutor() for managed_node2/TASK: Setup 30583 1726853772.58193: in run() - task 02083763-bbaf-05ea-abc5-0000000020b0 30583 1726853772.58212: variable 'ansible_search_path' from source: unknown 30583 1726853772.58220: variable 'ansible_search_path' from source: unknown 30583 1726853772.58272: variable 'lsr_setup' from source: include params 30583 1726853772.58499: variable 'lsr_setup' from source: include params 30583 1726853772.58638: variable 'omit' from source: magic vars 30583 1726853772.58747: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853772.58765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853772.58784: variable 'omit' from source: magic vars 30583 1726853772.59053: variable 'ansible_distribution_major_version' from source: facts 30583 1726853772.59080: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853772.59088: variable 'item' from source: unknown 30583 1726853772.59150: variable 'item' from source: unknown 30583 1726853772.59182: variable 'item' from source: unknown 30583 1726853772.59219: variable 'item' from source: unknown 30583 1726853772.59352: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853772.59355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853772.59357: variable 'omit' from source: magic vars 30583 1726853772.59440: variable 'ansible_distribution_major_version' from source: facts 30583 1726853772.59443: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853772.59450: variable 'item' from source: unknown 30583 1726853772.59499: variable 'item' from source: unknown 30583 1726853772.59522: variable 'item' from source: unknown 30583 1726853772.59566: variable 'item' from source: unknown 30583 1726853772.59628: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 
1726853772.59631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853772.59640: variable 'omit' from source: magic vars 30583 1726853772.59735: variable 'ansible_distribution_major_version' from source: facts 30583 1726853772.59738: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853772.59748: variable 'item' from source: unknown 30583 1726853772.59792: variable 'item' from source: unknown 30583 1726853772.59810: variable 'item' from source: unknown 30583 1726853772.59858: variable 'item' from source: unknown 30583 1726853772.59913: dumping result to json 30583 1726853772.59916: done dumping result, returning 30583 1726853772.59919: done running TaskExecutor() for managed_node2/TASK: Setup [02083763-bbaf-05ea-abc5-0000000020b0] 30583 1726853772.59922: sending task result for task 02083763-bbaf-05ea-abc5-0000000020b0 30583 1726853772.59952: done sending task result for task 02083763-bbaf-05ea-abc5-0000000020b0 30583 1726853772.59954: WORKER PROCESS EXITING 30583 1726853772.59985: no more pending results, returning what we have 30583 1726853772.59990: in VariableManager get_vars() 30583 1726853772.60036: Calling all_inventory to load vars for managed_node2 30583 1726853772.60039: Calling groups_inventory to load vars for managed_node2 30583 1726853772.60042: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853772.60055: Calling all_plugins_play to load vars for managed_node2 30583 1726853772.60058: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853772.60061: Calling groups_plugins_play to load vars for managed_node2 30583 1726853772.60885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853772.62279: done with get_vars() 30583 1726853772.62298: variable 'ansible_search_path' from source: unknown 30583 1726853772.62299: variable 'ansible_search_path' from source: unknown 30583 
1726853772.62338: variable 'ansible_search_path' from source: unknown 30583 1726853772.62340: variable 'ansible_search_path' from source: unknown 30583 1726853772.62369: variable 'ansible_search_path' from source: unknown 30583 1726853772.62370: variable 'ansible_search_path' from source: unknown 30583 1726853772.62399: we have included files to process 30583 1726853772.62400: generating all_blocks data 30583 1726853772.62402: done generating all_blocks data 30583 1726853772.62406: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30583 1726853772.62407: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30583 1726853772.62410: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30583 1726853772.62635: done processing included file 30583 1726853772.62638: iterating over new_blocks loaded from include file 30583 1726853772.62639: in VariableManager get_vars() 30583 1726853772.62656: done with get_vars() 30583 1726853772.62658: filtering new block on tags 30583 1726853772.62694: done filtering new block on tags 30583 1726853772.62697: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node2 => (item=tasks/create_bridge_profile.yml) 30583 1726853772.62702: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30583 1726853772.62703: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30583 1726853772.62706: Loading data from 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30583 1726853772.62800: done processing included file 30583 1726853772.62802: iterating over new_blocks loaded from include file 30583 1726853772.62803: in VariableManager get_vars() 30583 1726853772.62820: done with get_vars() 30583 1726853772.62821: filtering new block on tags 30583 1726853772.62842: done filtering new block on tags 30583 1726853772.62844: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed_node2 => (item=tasks/activate_profile.yml) 30583 1726853772.62848: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30583 1726853772.62849: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30583 1726853772.62852: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30583 1726853772.62939: done processing included file 30583 1726853772.62941: iterating over new_blocks loaded from include file 30583 1726853772.62943: in VariableManager get_vars() 30583 1726853772.62958: done with get_vars() 30583 1726853772.62959: filtering new block on tags 30583 1726853772.62982: done filtering new block on tags 30583 1726853772.62984: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml for managed_node2 => (item=tasks/remove+down_profile.yml) 30583 1726853772.62988: extending task lists for all hosts with included blocks 30583 1726853772.63740: done extending task lists 30583 1726853772.63741: done processing 
included files 30583 1726853772.63742: results queue empty 30583 1726853772.63743: checking for any_errors_fatal 30583 1726853772.63746: done checking for any_errors_fatal 30583 1726853772.63747: checking for max_fail_percentage 30583 1726853772.63748: done checking for max_fail_percentage 30583 1726853772.63749: checking to see if all hosts have failed and the running result is not ok 30583 1726853772.63750: done checking to see if all hosts have failed 30583 1726853772.63751: getting the remaining hosts for this loop 30583 1726853772.63752: done getting the remaining hosts for this loop 30583 1726853772.63754: getting the next task for host managed_node2 30583 1726853772.63758: done getting next task for host managed_node2 30583 1726853772.63760: ^ task is: TASK: Include network role 30583 1726853772.63763: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853772.63766: getting variables 30583 1726853772.63767: in VariableManager get_vars() 30583 1726853772.63779: Calling all_inventory to load vars for managed_node2 30583 1726853772.63781: Calling groups_inventory to load vars for managed_node2 30583 1726853772.63784: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853772.63790: Calling all_plugins_play to load vars for managed_node2 30583 1726853772.63792: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853772.63794: Calling groups_plugins_play to load vars for managed_node2 30583 1726853772.64885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853772.66407: done with get_vars() 30583 1726853772.66432: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Friday 20 September 2024 13:36:12 -0400 (0:00:00.089) 0:01:48.002 ****** 30583 1726853772.66512: entering _queue_task() for managed_node2/include_role 30583 1726853772.66884: worker is 1 (out of 1 available) 30583 1726853772.66898: exiting _queue_task() for managed_node2/include_role 30583 1726853772.66912: done queuing things up, now waiting for results queue to drain 30583 1726853772.66913: waiting for pending results... 
30583 1726853772.67296: running TaskExecutor() for managed_node2/TASK: Include network role 30583 1726853772.67309: in run() - task 02083763-bbaf-05ea-abc5-000000002139 30583 1726853772.67330: variable 'ansible_search_path' from source: unknown 30583 1726853772.67336: variable 'ansible_search_path' from source: unknown 30583 1726853772.67391: calling self._execute() 30583 1726853772.67506: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853772.67520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853772.67536: variable 'omit' from source: magic vars 30583 1726853772.67960: variable 'ansible_distribution_major_version' from source: facts 30583 1726853772.68043: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853772.68046: _execute() done 30583 1726853772.68048: dumping result to json 30583 1726853772.68051: done dumping result, returning 30583 1726853772.68054: done running TaskExecutor() for managed_node2/TASK: Include network role [02083763-bbaf-05ea-abc5-000000002139] 30583 1726853772.68056: sending task result for task 02083763-bbaf-05ea-abc5-000000002139 30583 1726853772.68264: done sending task result for task 02083763-bbaf-05ea-abc5-000000002139 30583 1726853772.68268: WORKER PROCESS EXITING 30583 1726853772.68297: no more pending results, returning what we have 30583 1726853772.68302: in VariableManager get_vars() 30583 1726853772.68350: Calling all_inventory to load vars for managed_node2 30583 1726853772.68353: Calling groups_inventory to load vars for managed_node2 30583 1726853772.68357: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853772.68368: Calling all_plugins_play to load vars for managed_node2 30583 1726853772.68373: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853772.68376: Calling groups_plugins_play to load vars for managed_node2 30583 1726853772.70003: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853772.71712: done with get_vars() 30583 1726853772.71746: variable 'ansible_search_path' from source: unknown 30583 1726853772.71747: variable 'ansible_search_path' from source: unknown 30583 1726853772.71967: variable 'omit' from source: magic vars 30583 1726853772.72008: variable 'omit' from source: magic vars 30583 1726853772.72024: variable 'omit' from source: magic vars 30583 1726853772.72028: we have included files to process 30583 1726853772.72029: generating all_blocks data 30583 1726853772.72031: done generating all_blocks data 30583 1726853772.72032: processing included file: fedora.linux_system_roles.network 30583 1726853772.72053: in VariableManager get_vars() 30583 1726853772.72081: done with get_vars() 30583 1726853772.72113: in VariableManager get_vars() 30583 1726853772.72132: done with get_vars() 30583 1726853772.72184: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30583 1726853772.72321: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30583 1726853772.72410: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30583 1726853772.72893: in VariableManager get_vars() 30583 1726853772.72914: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30583 1726853772.74778: iterating over new_blocks loaded from include file 30583 1726853772.74779: in VariableManager get_vars() 30583 1726853772.74793: done with get_vars() 30583 1726853772.74794: filtering new block on tags 30583 1726853772.74965: done filtering new block on tags 30583 1726853772.74967: in VariableManager get_vars() 30583 1726853772.74979: done with get_vars() 30583 1726853772.74980: filtering new block on tags 30583 1726853772.74990: done 
filtering new block on tags 30583 1726853772.74992: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30583 1726853772.74996: extending task lists for all hosts with included blocks 30583 1726853772.75094: done extending task lists 30583 1726853772.75095: done processing included files 30583 1726853772.75095: results queue empty 30583 1726853772.75096: checking for any_errors_fatal 30583 1726853772.75098: done checking for any_errors_fatal 30583 1726853772.75100: checking for max_fail_percentage 30583 1726853772.75101: done checking for max_fail_percentage 30583 1726853772.75101: checking to see if all hosts have failed and the running result is not ok 30583 1726853772.75102: done checking to see if all hosts have failed 30583 1726853772.75102: getting the remaining hosts for this loop 30583 1726853772.75103: done getting the remaining hosts for this loop 30583 1726853772.75105: getting the next task for host managed_node2 30583 1726853772.75108: done getting next task for host managed_node2 30583 1726853772.75110: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30583 1726853772.75112: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853772.75119: getting variables 30583 1726853772.75119: in VariableManager get_vars() 30583 1726853772.75128: Calling all_inventory to load vars for managed_node2 30583 1726853772.75130: Calling groups_inventory to load vars for managed_node2 30583 1726853772.75131: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853772.75135: Calling all_plugins_play to load vars for managed_node2 30583 1726853772.75136: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853772.75138: Calling groups_plugins_play to load vars for managed_node2 30583 1726853772.75806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853772.77269: done with get_vars() 30583 1726853772.77300: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:36:12 -0400 (0:00:00.108) 0:01:48.111 ****** 30583 1726853772.77382: entering _queue_task() for managed_node2/include_tasks 30583 1726853772.77786: worker is 1 (out of 1 available) 30583 1726853772.77799: exiting _queue_task() for managed_node2/include_tasks 30583 1726853772.77810: done queuing things up, now waiting for results queue to drain 30583 1726853772.77811: waiting for pending results... 
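Every task in this stretch of the log runs (or skips) behind the same guard, visible as `Evaluated conditional (ansible_distribution_major_version != '6'): True`. Ansible stores this fact as a string, so the check is a plain string inequality, not a numeric one. A minimal sketch (the `"40"` value is an assumption for illustration, not taken from this log):

```python
# Hypothetical re-creation of the conditional the log shows being evaluated.
# Ansible facts such as ansible_distribution_major_version are strings, so the
# comparison below is string inequality: '40' != '6' is True, and '60' would be too.
ansible_distribution_major_version = "40"  # assumed Fedora value, for illustration

run_task = ansible_distribution_major_version != "6"
print(run_task)  # True here; the task only skips when the string is exactly "6"
```

Because the comparison is lexical, a hypothetical major version like `"60"` also passes the `!= '6'` test; the role relies on exact string equality with `'6'` to exclude EL6-era hosts.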
30583 1726853772.78132: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30583 1726853772.78231: in run() - task 02083763-bbaf-05ea-abc5-0000000021a3 30583 1726853772.78242: variable 'ansible_search_path' from source: unknown 30583 1726853772.78245: variable 'ansible_search_path' from source: unknown 30583 1726853772.78280: calling self._execute() 30583 1726853772.78367: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853772.78374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853772.78384: variable 'omit' from source: magic vars 30583 1726853772.78682: variable 'ansible_distribution_major_version' from source: facts 30583 1726853772.78691: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853772.78696: _execute() done 30583 1726853772.78701: dumping result to json 30583 1726853772.78703: done dumping result, returning 30583 1726853772.78712: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-05ea-abc5-0000000021a3] 30583 1726853772.78714: sending task result for task 02083763-bbaf-05ea-abc5-0000000021a3 30583 1726853772.78801: done sending task result for task 02083763-bbaf-05ea-abc5-0000000021a3 30583 1726853772.78804: WORKER PROCESS EXITING 30583 1726853772.78855: no more pending results, returning what we have 30583 1726853772.78860: in VariableManager get_vars() 30583 1726853772.78916: Calling all_inventory to load vars for managed_node2 30583 1726853772.78922: Calling groups_inventory to load vars for managed_node2 30583 1726853772.78924: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853772.78936: Calling all_plugins_play to load vars for managed_node2 30583 1726853772.78939: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853772.78941: Calling 
groups_plugins_play to load vars for managed_node2 30583 1726853772.79861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853772.81198: done with get_vars() 30583 1726853772.81217: variable 'ansible_search_path' from source: unknown 30583 1726853772.81218: variable 'ansible_search_path' from source: unknown 30583 1726853772.81246: we have included files to process 30583 1726853772.81247: generating all_blocks data 30583 1726853772.81248: done generating all_blocks data 30583 1726853772.81251: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853772.81252: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853772.81253: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853772.81646: done processing included file 30583 1726853772.81647: iterating over new_blocks loaded from include file 30583 1726853772.81648: in VariableManager get_vars() 30583 1726853772.81668: done with get_vars() 30583 1726853772.81669: filtering new block on tags 30583 1726853772.81690: done filtering new block on tags 30583 1726853772.81692: in VariableManager get_vars() 30583 1726853772.81706: done with get_vars() 30583 1726853772.81707: filtering new block on tags 30583 1726853772.81735: done filtering new block on tags 30583 1726853772.81737: in VariableManager get_vars() 30583 1726853772.81751: done with get_vars() 30583 1726853772.81752: filtering new block on tags 30583 1726853772.81780: done filtering new block on tags 30583 1726853772.81782: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30583 1726853772.81786: extending task lists for 
all hosts with included blocks 30583 1726853772.82762: done extending task lists 30583 1726853772.82764: done processing included files 30583 1726853772.82764: results queue empty 30583 1726853772.82765: checking for any_errors_fatal 30583 1726853772.82767: done checking for any_errors_fatal 30583 1726853772.82768: checking for max_fail_percentage 30583 1726853772.82769: done checking for max_fail_percentage 30583 1726853772.82770: checking to see if all hosts have failed and the running result is not ok 30583 1726853772.82770: done checking to see if all hosts have failed 30583 1726853772.82772: getting the remaining hosts for this loop 30583 1726853772.82773: done getting the remaining hosts for this loop 30583 1726853772.82775: getting the next task for host managed_node2 30583 1726853772.82779: done getting next task for host managed_node2 30583 1726853772.82781: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30583 1726853772.82783: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853772.82792: getting variables 30583 1726853772.82792: in VariableManager get_vars() 30583 1726853772.82806: Calling all_inventory to load vars for managed_node2 30583 1726853772.82807: Calling groups_inventory to load vars for managed_node2 30583 1726853772.82808: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853772.82812: Calling all_plugins_play to load vars for managed_node2 30583 1726853772.82814: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853772.82816: Calling groups_plugins_play to load vars for managed_node2 30583 1726853772.83506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853772.84374: done with get_vars() 30583 1726853772.84390: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:36:12 -0400 (0:00:00.070) 0:01:48.181 ****** 30583 1726853772.84448: entering _queue_task() for managed_node2/setup 30583 1726853772.84733: worker is 1 (out of 1 available) 30583 1726853772.84747: exiting _queue_task() for managed_node2/setup 30583 1726853772.84763: done queuing things up, now waiting for results queue to drain 30583 1726853772.84765: waiting for pending results... 
30583 1726853772.84964: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30583 1726853772.85085: in run() - task 02083763-bbaf-05ea-abc5-000000002200 30583 1726853772.85096: variable 'ansible_search_path' from source: unknown 30583 1726853772.85102: variable 'ansible_search_path' from source: unknown 30583 1726853772.85131: calling self._execute() 30583 1726853772.85209: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853772.85214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853772.85223: variable 'omit' from source: magic vars 30583 1726853772.85509: variable 'ansible_distribution_major_version' from source: facts 30583 1726853772.85518: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853772.85675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853772.87189: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853772.87234: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853772.87263: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853772.87293: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853772.87312: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853772.87373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853772.87397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853772.87415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853772.87440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853772.87451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853772.87494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853772.87510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853772.87526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853772.87550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853772.87563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853772.87675: variable '__network_required_facts' from source: role 
'' defaults 30583 1726853772.87683: variable 'ansible_facts' from source: unknown 30583 1726853772.88154: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30583 1726853772.88158: when evaluation is False, skipping this task 30583 1726853772.88160: _execute() done 30583 1726853772.88165: dumping result to json 30583 1726853772.88167: done dumping result, returning 30583 1726853772.88178: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-05ea-abc5-000000002200] 30583 1726853772.88182: sending task result for task 02083763-bbaf-05ea-abc5-000000002200 30583 1726853772.88273: done sending task result for task 02083763-bbaf-05ea-abc5-000000002200 30583 1726853772.88276: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853772.88318: no more pending results, returning what we have 30583 1726853772.88322: results queue empty 30583 1726853772.88323: checking for any_errors_fatal 30583 1726853772.88324: done checking for any_errors_fatal 30583 1726853772.88325: checking for max_fail_percentage 30583 1726853772.88327: done checking for max_fail_percentage 30583 1726853772.88328: checking to see if all hosts have failed and the running result is not ok 30583 1726853772.88328: done checking to see if all hosts have failed 30583 1726853772.88329: getting the remaining hosts for this loop 30583 1726853772.88331: done getting the remaining hosts for this loop 30583 1726853772.88334: getting the next task for host managed_node2 30583 1726853772.88347: done getting next task for host managed_node2 30583 1726853772.88351: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30583 1726853772.88357: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853772.88388: getting variables 30583 1726853772.88390: in VariableManager get_vars() 30583 1726853772.88436: Calling all_inventory to load vars for managed_node2 30583 1726853772.88439: Calling groups_inventory to load vars for managed_node2 30583 1726853772.88441: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853772.88450: Calling all_plugins_play to load vars for managed_node2 30583 1726853772.88452: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853772.88461: Calling groups_plugins_play to load vars for managed_node2 30583 1726853772.89287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853772.90178: done with get_vars() 30583 1726853772.90197: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:36:12 -0400 (0:00:00.058) 0:01:48.239 ****** 30583 1726853772.90274: entering _queue_task() for managed_node2/stat 30583 1726853772.90534: worker is 1 (out of 1 available) 30583 1726853772.90550: exiting _queue_task() for managed_node2/stat 30583 1726853772.90563: done queuing things up, now waiting for results queue to drain 30583 1726853772.90565: waiting for pending results... 
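The `censored`/skipped result above comes from the when-condition `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` evaluating to `False`. Ansible's `difference` filter keeps the unique items of its left operand that are absent from its right operand, so the fact-gathering setup task only runs when at least one required fact is still missing. A minimal Python re-creation (the fact names and values here are made up; the real `__network_required_facts` list is role-internal and not shown in this log):

```python
def difference(a, b):
    # Mimics Ansible's `difference` filter: unique items of `a` not present in `b`.
    return [item for item in dict.fromkeys(a) if item not in set(b)]

# Hypothetical values for illustration only.
required_facts = ["distribution", "distribution_major_version"]
gathered = {"distribution": "Fedora", "distribution_major_version": "40", "os_family": "RedHat"}

missing = difference(required_facts, list(gathered))
print(len(missing) > 0)  # False -> the "Ensure ansible_facts" setup task is skipped
```

In other words, the role treats gathering as idempotent: once every required key exists in `ansible_facts`, re-running the play never re-invokes `setup` for them.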
30583 1726853772.90773: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30583 1726853772.90882: in run() - task 02083763-bbaf-05ea-abc5-000000002202 30583 1726853772.90894: variable 'ansible_search_path' from source: unknown 30583 1726853772.90899: variable 'ansible_search_path' from source: unknown 30583 1726853772.90929: calling self._execute() 30583 1726853772.91010: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853772.91014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853772.91024: variable 'omit' from source: magic vars 30583 1726853772.91306: variable 'ansible_distribution_major_version' from source: facts 30583 1726853772.91316: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853772.91436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853772.91642: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853772.91678: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853772.91703: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853772.91728: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853772.91799: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853772.91816: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853772.91835: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853772.91853: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853772.91920: variable '__network_is_ostree' from source: set_fact 30583 1726853772.91926: Evaluated conditional (not __network_is_ostree is defined): False 30583 1726853772.91929: when evaluation is False, skipping this task 30583 1726853772.91931: _execute() done 30583 1726853772.91933: dumping result to json 30583 1726853772.91936: done dumping result, returning 30583 1726853772.91944: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-05ea-abc5-000000002202] 30583 1726853772.91948: sending task result for task 02083763-bbaf-05ea-abc5-000000002202 30583 1726853772.92035: done sending task result for task 02083763-bbaf-05ea-abc5-000000002202 30583 1726853772.92038: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30583 1726853772.92091: no more pending results, returning what we have 30583 1726853772.92095: results queue empty 30583 1726853772.92096: checking for any_errors_fatal 30583 1726853772.92107: done checking for any_errors_fatal 30583 1726853772.92107: checking for max_fail_percentage 30583 1726853772.92109: done checking for max_fail_percentage 30583 1726853772.92110: checking to see if all hosts have failed and the running result is not ok 30583 1726853772.92111: done checking to see if all hosts have failed 30583 1726853772.92112: getting the remaining hosts for this loop 30583 1726853772.92113: done getting the remaining hosts for this loop 30583 
1726853772.92117: getting the next task for host managed_node2 30583 1726853772.92125: done getting next task for host managed_node2 30583 1726853772.92129: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30583 1726853772.92135: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853772.92155: getting variables 30583 1726853772.92157: in VariableManager get_vars() 30583 1726853772.92211: Calling all_inventory to load vars for managed_node2 30583 1726853772.92214: Calling groups_inventory to load vars for managed_node2 30583 1726853772.92216: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853772.92225: Calling all_plugins_play to load vars for managed_node2 30583 1726853772.92227: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853772.92230: Calling groups_plugins_play to load vars for managed_node2 30583 1726853772.93149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853772.94023: done with get_vars() 30583 1726853772.94040: done getting variables 30583 1726853772.94086: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:36:12 -0400 (0:00:00.038) 0:01:48.278 ****** 30583 1726853772.94113: entering _queue_task() for managed_node2/set_fact 30583 1726853772.94376: worker is 1 (out of 1 available) 30583 1726853772.94389: exiting _queue_task() for managed_node2/set_fact 30583 1726853772.94402: done queuing things up, now waiting for results queue to drain 30583 1726853772.94403: waiting for pending results... 
30583 1726853772.94591: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30583 1726853772.94685: in run() - task 02083763-bbaf-05ea-abc5-000000002203 30583 1726853772.94696: variable 'ansible_search_path' from source: unknown 30583 1726853772.94699: variable 'ansible_search_path' from source: unknown 30583 1726853772.94727: calling self._execute() 30583 1726853772.94809: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853772.94812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853772.94821: variable 'omit' from source: magic vars 30583 1726853772.95101: variable 'ansible_distribution_major_version' from source: facts 30583 1726853772.95110: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853772.95226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853772.95421: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853772.95454: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853772.95482: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853772.95508: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853772.95574: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853772.95592: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853772.95612: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853772.95630: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853772.95696: variable '__network_is_ostree' from source: set_fact 30583 1726853772.95702: Evaluated conditional (not __network_is_ostree is defined): False 30583 1726853772.95705: when evaluation is False, skipping this task 30583 1726853772.95707: _execute() done 30583 1726853772.95710: dumping result to json 30583 1726853772.95712: done dumping result, returning 30583 1726853772.95723: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-05ea-abc5-000000002203] 30583 1726853772.95726: sending task result for task 02083763-bbaf-05ea-abc5-000000002203 30583 1726853772.95813: done sending task result for task 02083763-bbaf-05ea-abc5-000000002203 30583 1726853772.95816: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30583 1726853772.95869: no more pending results, returning what we have 30583 1726853772.95874: results queue empty 30583 1726853772.95876: checking for any_errors_fatal 30583 1726853772.95882: done checking for any_errors_fatal 30583 1726853772.95883: checking for max_fail_percentage 30583 1726853772.95885: done checking for max_fail_percentage 30583 1726853772.95886: checking to see if all hosts have failed and the running result is not ok 30583 1726853772.95887: done checking to see if all hosts have failed 30583 1726853772.95888: getting the remaining hosts for this loop 30583 1726853772.95889: done getting the remaining hosts for this loop 
30583 1726853772.95893: getting the next task for host managed_node2 30583 1726853772.95904: done getting next task for host managed_node2 30583 1726853772.95907: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853772.95913: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853772.95932: getting variables 30583 1726853772.95933: in VariableManager get_vars() 30583 1726853772.95978: Calling all_inventory to load vars for managed_node2 30583 1726853772.95981: Calling groups_inventory to load vars for managed_node2 30583 1726853772.95983: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853772.95992: Calling all_plugins_play to load vars for managed_node2 30583 1726853772.95994: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853772.95997: Calling groups_plugins_play to load vars for managed_node2 30583 1726853772.96794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853772.97806: done with get_vars() 30583 1726853772.97825: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:36:12 -0400 (0:00:00.037) 0:01:48.316 ****** 30583 1726853772.97900: entering _queue_task() for managed_node2/service_facts 30583 1726853772.98165: worker is 1 (out of 1 available) 30583 1726853772.98181: exiting _queue_task() for managed_node2/service_facts 30583 1726853772.98194: done queuing things up, now waiting for results queue to drain 30583 1726853772.98196: waiting for pending results... 
30583 1726853772.98405: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853772.98681: in run() - task 02083763-bbaf-05ea-abc5-000000002205 30583 1726853772.98685: variable 'ansible_search_path' from source: unknown 30583 1726853772.98688: variable 'ansible_search_path' from source: unknown 30583 1726853772.98691: calling self._execute() 30583 1726853772.98725: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853772.98737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853772.98753: variable 'omit' from source: magic vars 30583 1726853772.99187: variable 'ansible_distribution_major_version' from source: facts 30583 1726853772.99198: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853772.99210: variable 'omit' from source: magic vars 30583 1726853772.99292: variable 'omit' from source: magic vars 30583 1726853772.99330: variable 'omit' from source: magic vars 30583 1726853772.99433: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853772.99503: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853772.99546: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853772.99563: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853772.99574: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853772.99601: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853772.99604: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853772.99608: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853772.99690: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853772.99694: Set connection var ansible_timeout to 10 30583 1726853772.99697: Set connection var ansible_connection to ssh 30583 1726853772.99702: Set connection var ansible_shell_executable to /bin/sh 30583 1726853772.99705: Set connection var ansible_shell_type to sh 30583 1726853772.99713: Set connection var ansible_pipelining to False 30583 1726853772.99732: variable 'ansible_shell_executable' from source: unknown 30583 1726853772.99734: variable 'ansible_connection' from source: unknown 30583 1726853772.99737: variable 'ansible_module_compression' from source: unknown 30583 1726853772.99741: variable 'ansible_shell_type' from source: unknown 30583 1726853772.99743: variable 'ansible_shell_executable' from source: unknown 30583 1726853772.99745: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853772.99748: variable 'ansible_pipelining' from source: unknown 30583 1726853772.99750: variable 'ansible_timeout' from source: unknown 30583 1726853772.99755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853772.99904: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853772.99914: variable 'omit' from source: magic vars 30583 1726853772.99919: starting attempt loop 30583 1726853772.99921: running the handler 30583 1726853772.99933: _low_level_execute_command(): starting 30583 1726853772.99941: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853773.00438: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853773.00463: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853773.00467: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853773.00469: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853773.00526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853773.00529: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853773.00534: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853773.00611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853773.02354: stdout chunk (state=3): >>>/root <<< 30583 1726853773.02446: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853773.02676: stderr chunk (state=3): >>><<< 30583 1726853773.02680: stdout chunk (state=3): >>><<< 30583 1726853773.02685: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853773.02687: _low_level_execute_command(): starting 30583 1726853773.02691: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853773.0251558-35649-149562113215342 `" && echo ansible-tmp-1726853773.0251558-35649-149562113215342="` echo /root/.ansible/tmp/ansible-tmp-1726853773.0251558-35649-149562113215342 `" ) && sleep 0' 30583 1726853773.03198: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853773.03204: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853773.03292: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853773.03307: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853773.03330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853773.03424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853773.05494: stdout chunk (state=3): >>>ansible-tmp-1726853773.0251558-35649-149562113215342=/root/.ansible/tmp/ansible-tmp-1726853773.0251558-35649-149562113215342 <<< 30583 1726853773.05620: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853773.05652: stderr chunk (state=3): >>><<< 30583 1726853773.05655: stdout chunk (state=3): >>><<< 30583 1726853773.05674: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853773.0251558-35649-149562113215342=/root/.ansible/tmp/ansible-tmp-1726853773.0251558-35649-149562113215342 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853773.05876: variable 'ansible_module_compression' from source: unknown 30583 1726853773.05879: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30583 1726853773.05882: variable 'ansible_facts' from source: unknown 30583 1726853773.05923: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853773.0251558-35649-149562113215342/AnsiballZ_service_facts.py 30583 1726853773.06131: Sending initial data 30583 1726853773.06134: Sent initial data (162 bytes) 30583 1726853773.06736: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853773.06768: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853773.06882: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853773.06906: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853773.07010: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853773.08719: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853773.08777: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853773.08858: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpwjf6tn4d /root/.ansible/tmp/ansible-tmp-1726853773.0251558-35649-149562113215342/AnsiballZ_service_facts.py <<< 30583 1726853773.08864: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853773.0251558-35649-149562113215342/AnsiballZ_service_facts.py" <<< 30583 1726853773.08925: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpwjf6tn4d" to remote "/root/.ansible/tmp/ansible-tmp-1726853773.0251558-35649-149562113215342/AnsiballZ_service_facts.py" <<< 30583 1726853773.08929: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853773.0251558-35649-149562113215342/AnsiballZ_service_facts.py" <<< 30583 1726853773.09777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853773.09916: stdout chunk (state=3): >>><<< 30583 1726853773.09919: stderr chunk (state=3): >>><<< 30583 1726853773.09922: done transferring module to remote 30583 1726853773.09924: _low_level_execute_command(): starting 30583 1726853773.09933: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853773.0251558-35649-149562113215342/ /root/.ansible/tmp/ansible-tmp-1726853773.0251558-35649-149562113215342/AnsiballZ_service_facts.py && sleep 0' 30583 1726853773.10377: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853773.10380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853773.10383: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853773.10385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853773.10441: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853773.10445: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853773.10525: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853773.12586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853773.12590: stdout chunk (state=3): >>><<< 30583 1726853773.12592: stderr chunk (state=3): >>><<< 30583 1726853773.12594: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853773.12596: _low_level_execute_command(): starting 30583 1726853773.12598: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853773.0251558-35649-149562113215342/AnsiballZ_service_facts.py && sleep 0' 30583 1726853773.13120: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853773.13140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853773.13150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853773.13192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853773.13209: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853773.13290: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853774.79109: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": 
"dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 30583 1726853774.79119: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": 
"systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 30583 1726853774.79151: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, 
"dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 30583 1726853774.79169: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": 
"sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "st<<< 30583 1726853774.79185: stdout chunk (state=3): >>>atic", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "<<< 30583 1726853774.79193: stdout chunk (state=3): >>>systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30583 1726853774.80818: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853774.80848: stderr chunk (state=3): >>><<< 30583 1726853774.80852: stdout chunk (state=3): >>><<< 30583 1726853774.80876: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": 
"getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": 
"systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": 
"running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": 
"inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": 
"systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853774.81315: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853773.0251558-35649-149562113215342/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853774.81323: _low_level_execute_command(): starting 30583 1726853774.81329: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853773.0251558-35649-149562113215342/ > /dev/null 2>&1 && sleep 0' 30583 1726853774.81762: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853774.81766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853774.81794: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853774.81797: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30583 1726853774.81800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853774.81856: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853774.81865: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853774.81867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853774.81937: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853774.83843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853774.83867: stderr chunk (state=3): >>><<< 30583 1726853774.83872: stdout chunk (state=3): >>><<< 30583 1726853774.83885: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853774.83891: handler run complete 30583 1726853774.84011: variable 'ansible_facts' from source: unknown 30583 1726853774.84109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853774.84393: variable 'ansible_facts' from source: unknown 30583 1726853774.84472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853774.84585: attempt loop complete, returning result 30583 1726853774.84588: _execute() done 30583 1726853774.84591: dumping result to json 30583 1726853774.84629: done dumping result, returning 30583 1726853774.84638: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-05ea-abc5-000000002205] 30583 1726853774.84643: sending task result for task 02083763-bbaf-05ea-abc5-000000002205 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853774.85278: no more pending results, returning what we have 30583 1726853774.85281: results queue empty 30583 1726853774.85282: checking for any_errors_fatal 30583 1726853774.85286: done checking for any_errors_fatal 30583 1726853774.85287: checking for max_fail_percentage 30583 1726853774.85290: done checking for max_fail_percentage 30583 1726853774.85291: checking to see if all hosts have failed and the running result is not ok 30583 1726853774.85292: done checking to see if all hosts have failed 30583 1726853774.85292: getting the remaining hosts for this loop 30583 1726853774.85294: done getting the remaining hosts for this loop 30583 1726853774.85297: getting the next task for 
host managed_node2 30583 1726853774.85304: done getting next task for host managed_node2 30583 1726853774.85307: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 1726853774.85312: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853774.85323: getting variables 30583 1726853774.85324: in VariableManager get_vars() 30583 1726853774.85354: Calling all_inventory to load vars for managed_node2 30583 1726853774.85356: Calling groups_inventory to load vars for managed_node2 30583 1726853774.85360: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853774.85367: Calling all_plugins_play to load vars for managed_node2 30583 1726853774.85369: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853774.85373: Calling groups_plugins_play to load vars for managed_node2 30583 1726853774.85887: done sending task result for task 02083763-bbaf-05ea-abc5-000000002205 30583 1726853774.86196: WORKER PROCESS EXITING 30583 1726853774.86208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853774.87075: done with get_vars() 30583 1726853774.87093: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:36:14 -0400 (0:00:01.892) 0:01:50.208 ****** 30583 1726853774.87165: entering _queue_task() for managed_node2/package_facts 30583 1726853774.87430: worker is 1 (out of 1 available) 30583 1726853774.87445: exiting _queue_task() for managed_node2/package_facts 30583 1726853774.87462: done queuing things up, now waiting for results queue to drain 30583 1726853774.87464: waiting for pending results... 
30583 1726853774.87652: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 1726853774.87760: in run() - task 02083763-bbaf-05ea-abc5-000000002206 30583 1726853774.87770: variable 'ansible_search_path' from source: unknown 30583 1726853774.87777: variable 'ansible_search_path' from source: unknown 30583 1726853774.87804: calling self._execute() 30583 1726853774.87885: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853774.87888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853774.87897: variable 'omit' from source: magic vars 30583 1726853774.88179: variable 'ansible_distribution_major_version' from source: facts 30583 1726853774.88190: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853774.88195: variable 'omit' from source: magic vars 30583 1726853774.88252: variable 'omit' from source: magic vars 30583 1726853774.88276: variable 'omit' from source: magic vars 30583 1726853774.88310: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853774.88337: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853774.88353: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853774.88368: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853774.88379: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853774.88404: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853774.88407: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853774.88410: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853774.88484: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853774.88489: Set connection var ansible_timeout to 10 30583 1726853774.88492: Set connection var ansible_connection to ssh 30583 1726853774.88497: Set connection var ansible_shell_executable to /bin/sh 30583 1726853774.88499: Set connection var ansible_shell_type to sh 30583 1726853774.88508: Set connection var ansible_pipelining to False 30583 1726853774.88525: variable 'ansible_shell_executable' from source: unknown 30583 1726853774.88528: variable 'ansible_connection' from source: unknown 30583 1726853774.88531: variable 'ansible_module_compression' from source: unknown 30583 1726853774.88533: variable 'ansible_shell_type' from source: unknown 30583 1726853774.88536: variable 'ansible_shell_executable' from source: unknown 30583 1726853774.88538: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853774.88540: variable 'ansible_pipelining' from source: unknown 30583 1726853774.88543: variable 'ansible_timeout' from source: unknown 30583 1726853774.88546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853774.88691: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853774.88700: variable 'omit' from source: magic vars 30583 1726853774.88705: starting attempt loop 30583 1726853774.88708: running the handler 30583 1726853774.88719: _low_level_execute_command(): starting 30583 1726853774.88727: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853774.89242: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30583 1726853774.89247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853774.89250: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853774.89252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853774.89301: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853774.89304: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853774.89306: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853774.89387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853774.91115: stdout chunk (state=3): >>>/root <<< 30583 1726853774.91218: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853774.91247: stderr chunk (state=3): >>><<< 30583 1726853774.91250: stdout chunk (state=3): >>><<< 30583 1726853774.91275: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853774.91287: _low_level_execute_command(): starting 30583 1726853774.91293: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853774.912732-35697-248170166031751 `" && echo ansible-tmp-1726853774.912732-35697-248170166031751="` echo /root/.ansible/tmp/ansible-tmp-1726853774.912732-35697-248170166031751 `" ) && sleep 0' 30583 1726853774.91728: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853774.91731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853774.91734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853774.91744: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853774.91746: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853774.91793: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853774.91801: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853774.91803: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853774.91887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853774.93856: stdout chunk (state=3): >>>ansible-tmp-1726853774.912732-35697-248170166031751=/root/.ansible/tmp/ansible-tmp-1726853774.912732-35697-248170166031751 <<< 30583 1726853774.93965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853774.93995: stderr chunk (state=3): >>><<< 30583 1726853774.93998: stdout chunk (state=3): >>><<< 30583 1726853774.94015: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853774.912732-35697-248170166031751=/root/.ansible/tmp/ansible-tmp-1726853774.912732-35697-248170166031751 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853774.94054: variable 'ansible_module_compression' from source: unknown 30583 1726853774.94095: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30583 1726853774.94147: variable 'ansible_facts' from source: unknown 30583 1726853774.94267: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853774.912732-35697-248170166031751/AnsiballZ_package_facts.py 30583 1726853774.94375: Sending initial data 30583 1726853774.94379: Sent initial data (161 bytes) 30583 1726853774.94824: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853774.94828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853774.94830: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853774.94832: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853774.94834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853774.94836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853774.94877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853774.94896: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853774.94964: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853774.96646: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30583 1726853774.96651: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853774.96713: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853774.96786: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp8q_jgbtr /root/.ansible/tmp/ansible-tmp-1726853774.912732-35697-248170166031751/AnsiballZ_package_facts.py <<< 30583 1726853774.96790: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853774.912732-35697-248170166031751/AnsiballZ_package_facts.py" <<< 30583 1726853774.96852: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp8q_jgbtr" to remote "/root/.ansible/tmp/ansible-tmp-1726853774.912732-35697-248170166031751/AnsiballZ_package_facts.py" <<< 30583 1726853774.96855: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853774.912732-35697-248170166031751/AnsiballZ_package_facts.py" <<< 30583 1726853774.98277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853774.98314: stderr chunk (state=3): >>><<< 30583 1726853774.98317: stdout chunk (state=3): >>><<< 30583 1726853774.98333: done transferring module to remote 30583 1726853774.98342: _low_level_execute_command(): starting 30583 1726853774.98347: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853774.912732-35697-248170166031751/ /root/.ansible/tmp/ansible-tmp-1726853774.912732-35697-248170166031751/AnsiballZ_package_facts.py && sleep 0' 30583 1726853774.98796: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853774.98799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853774.98803: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853774.98805: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853774.98812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853774.98814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853774.98853: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853774.98856: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853774.98935: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853775.00844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853775.00874: stderr chunk (state=3): >>><<< 30583 1726853775.00877: stdout chunk (state=3): >>><<< 30583 1726853775.00890: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853775.00894: _low_level_execute_command(): starting 30583 1726853775.00899: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853774.912732-35697-248170166031751/AnsiballZ_package_facts.py && sleep 0' 30583 1726853775.01325: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853775.01328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853775.01331: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853775.01333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853775.01335: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853775.01389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853775.01396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853775.01397: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853775.01475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853775.46587: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": 
"4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 30583 1726853775.46619: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": 
"polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": 
"6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 30583 1726853775.46661: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", 
"version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 30583 1726853775.46707: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": 
"256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", 
"version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": 
"python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 30583 1726853775.46737: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": 
"4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": 
"x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", 
"source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": 
[{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", 
"release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 30583 1726853775.46765: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": 
"2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": 
"7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 30583 1726853775.46784: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": 
[{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30583 1726853775.48648: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853775.48651: stdout chunk (state=3): >>><<< 30583 1726853775.48654: stderr chunk (state=3): >>><<< 30583 1726853775.48888: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853775.51053: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853774.912732-35697-248170166031751/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853775.51070: _low_level_execute_command(): starting 30583 1726853775.51089: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853774.912732-35697-248170166031751/ > /dev/null 2>&1 && sleep 0' 30583 1726853775.51723: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853775.51786: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853775.51858: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853775.51879: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853775.51903: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853775.52011: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853775.54001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853775.54020: stdout chunk (state=3): >>><<< 30583 1726853775.54037: stderr chunk (state=3): >>><<< 30583 1726853775.54176: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853775.54180: handler run complete 30583 
1726853775.54932: variable 'ansible_facts' from source: unknown 30583 1726853775.55426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853775.57423: variable 'ansible_facts' from source: unknown 30583 1726853775.57887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853775.59477: attempt loop complete, returning result 30583 1726853775.59505: _execute() done 30583 1726853775.59514: dumping result to json 30583 1726853775.59978: done dumping result, returning 30583 1726853775.59982: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-05ea-abc5-000000002206] 30583 1726853775.59984: sending task result for task 02083763-bbaf-05ea-abc5-000000002206 30583 1726853775.70151: done sending task result for task 02083763-bbaf-05ea-abc5-000000002206 30583 1726853775.70155: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853775.70325: no more pending results, returning what we have 30583 1726853775.70328: results queue empty 30583 1726853775.70329: checking for any_errors_fatal 30583 1726853775.70337: done checking for any_errors_fatal 30583 1726853775.70338: checking for max_fail_percentage 30583 1726853775.70340: done checking for max_fail_percentage 30583 1726853775.70341: checking to see if all hosts have failed and the running result is not ok 30583 1726853775.70341: done checking to see if all hosts have failed 30583 1726853775.70342: getting the remaining hosts for this loop 30583 1726853775.70344: done getting the remaining hosts for this loop 30583 1726853775.70347: getting the next task for host managed_node2 30583 1726853775.70355: done getting next task for host managed_node2 30583 1726853775.70359: ^ task is: 
TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853775.70364: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853775.70381: getting variables 30583 1726853775.70382: in VariableManager get_vars() 30583 1726853775.70417: Calling all_inventory to load vars for managed_node2 30583 1726853775.70420: Calling groups_inventory to load vars for managed_node2 30583 1726853775.70423: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853775.70431: Calling all_plugins_play to load vars for managed_node2 30583 1726853775.70434: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853775.70437: Calling groups_plugins_play to load vars for managed_node2 30583 1726853775.71650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853775.73338: done with get_vars() 30583 1726853775.73361: done getting variables 30583 1726853775.73431: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:36:15 -0400 (0:00:00.863) 0:01:51.071 ****** 30583 1726853775.73478: entering _queue_task() for managed_node2/debug 30583 1726853775.73866: worker is 1 (out of 1 available) 30583 1726853775.73880: exiting _queue_task() for managed_node2/debug 30583 1726853775.73893: done queuing things up, now waiting for results queue to drain 30583 1726853775.73895: waiting for pending results... 
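The `Print network provider` task queued above is a plain `debug` action; the trace later records its result as `MSG: Using network provider: nm`. A hedged sketch of what such a task looks like at `roles/network/tasks/main.yml:7` (the exact wording in the role may differ):

```yaml
- name: Print network provider
  debug:
    msg: "Using network provider: {{ network_provider }}"
```

`network_provider` comes from a `set_fact` earlier in the role, which matches the `variable 'network_provider' from source: set_fact` line in the trace below.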
30583 1726853775.74294: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853775.74361: in run() - task 02083763-bbaf-05ea-abc5-0000000021a4 30583 1726853775.74392: variable 'ansible_search_path' from source: unknown 30583 1726853775.74401: variable 'ansible_search_path' from source: unknown 30583 1726853775.74443: calling self._execute() 30583 1726853775.74552: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853775.74564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853775.74582: variable 'omit' from source: magic vars 30583 1726853775.74988: variable 'ansible_distribution_major_version' from source: facts 30583 1726853775.75005: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853775.75016: variable 'omit' from source: magic vars 30583 1726853775.75096: variable 'omit' from source: magic vars 30583 1726853775.75201: variable 'network_provider' from source: set_fact 30583 1726853775.75223: variable 'omit' from source: magic vars 30583 1726853775.75277: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853775.75314: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853775.75341: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853775.75367: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853775.75389: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853775.75423: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853775.75433: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 
1726853775.75443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853775.75556: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853775.75569: Set connection var ansible_timeout to 10 30583 1726853775.75583: Set connection var ansible_connection to ssh 30583 1726853775.75599: Set connection var ansible_shell_executable to /bin/sh 30583 1726853775.75676: Set connection var ansible_shell_type to sh 30583 1726853775.75679: Set connection var ansible_pipelining to False 30583 1726853775.75681: variable 'ansible_shell_executable' from source: unknown 30583 1726853775.75684: variable 'ansible_connection' from source: unknown 30583 1726853775.75686: variable 'ansible_module_compression' from source: unknown 30583 1726853775.75689: variable 'ansible_shell_type' from source: unknown 30583 1726853775.75691: variable 'ansible_shell_executable' from source: unknown 30583 1726853775.75697: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853775.75703: variable 'ansible_pipelining' from source: unknown 30583 1726853775.75705: variable 'ansible_timeout' from source: unknown 30583 1726853775.75707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853775.75840: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853775.75859: variable 'omit' from source: magic vars 30583 1726853775.75869: starting attempt loop 30583 1726853775.75878: running the handler 30583 1726853775.75936: handler run complete 30583 1726853775.75954: attempt loop complete, returning result 30583 1726853775.75961: _execute() done 30583 1726853775.75967: dumping result to json 30583 1726853775.75976: done dumping result, returning 
30583 1726853775.76029: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-05ea-abc5-0000000021a4] 30583 1726853775.76032: sending task result for task 02083763-bbaf-05ea-abc5-0000000021a4 ok: [managed_node2] => {} MSG: Using network provider: nm 30583 1726853775.76236: no more pending results, returning what we have 30583 1726853775.76246: results queue empty 30583 1726853775.76248: checking for any_errors_fatal 30583 1726853775.76258: done checking for any_errors_fatal 30583 1726853775.76259: checking for max_fail_percentage 30583 1726853775.76262: done checking for max_fail_percentage 30583 1726853775.76263: checking to see if all hosts have failed and the running result is not ok 30583 1726853775.76263: done checking to see if all hosts have failed 30583 1726853775.76264: getting the remaining hosts for this loop 30583 1726853775.76267: done getting the remaining hosts for this loop 30583 1726853775.76272: getting the next task for host managed_node2 30583 1726853775.76282: done getting next task for host managed_node2 30583 1726853775.76287: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853775.76292: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853775.76307: getting variables 30583 1726853775.76309: in VariableManager get_vars() 30583 1726853775.76491: Calling all_inventory to load vars for managed_node2 30583 1726853775.76495: Calling groups_inventory to load vars for managed_node2 30583 1726853775.76497: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853775.76582: Calling all_plugins_play to load vars for managed_node2 30583 1726853775.76585: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853775.76594: Calling groups_plugins_play to load vars for managed_node2 30583 1726853775.83959: done sending task result for task 02083763-bbaf-05ea-abc5-0000000021a4 30583 1726853775.83963: WORKER PROCESS EXITING 30583 1726853775.84872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853775.86596: done with get_vars() 30583 1726853775.86634: done getting variables 30583 1726853775.86702: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:36:15 -0400 (0:00:00.132) 0:01:51.204 ****** 30583 1726853775.86747: entering _queue_task() for managed_node2/fail 30583 1726853775.87176: worker is 1 (out of 1 available) 30583 1726853775.87190: exiting _queue_task() for managed_node2/fail 30583 1726853775.87211: done queuing things up, now waiting for results queue to drain 30583 1726853775.87213: waiting for pending results... 30583 1726853775.87503: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853775.87679: in run() - task 02083763-bbaf-05ea-abc5-0000000021a5 30583 1726853775.87694: variable 'ansible_search_path' from source: unknown 30583 1726853775.87698: variable 'ansible_search_path' from source: unknown 30583 1726853775.87736: calling self._execute() 30583 1726853775.87839: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853775.87844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853775.87862: variable 'omit' from source: magic vars 30583 1726853775.88261: variable 'ansible_distribution_major_version' from source: facts 30583 1726853775.88300: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853775.88411: variable 'network_state' from source: role '' defaults 30583 1726853775.88476: Evaluated conditional (network_state != {}): False 30583 1726853775.88479: when evaluation is False, skipping this task 30583 1726853775.88482: _execute() done 30583 1726853775.88484: dumping result to json 30583 1726853775.88487: done dumping result, returning 30583 1726853775.88491: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network 
state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-05ea-abc5-0000000021a5] 30583 1726853775.88493: sending task result for task 02083763-bbaf-05ea-abc5-0000000021a5 30583 1726853775.88569: done sending task result for task 02083763-bbaf-05ea-abc5-0000000021a5 30583 1726853775.88574: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853775.88783: no more pending results, returning what we have 30583 1726853775.88786: results queue empty 30583 1726853775.88787: checking for any_errors_fatal 30583 1726853775.88793: done checking for any_errors_fatal 30583 1726853775.88794: checking for max_fail_percentage 30583 1726853775.88795: done checking for max_fail_percentage 30583 1726853775.88796: checking to see if all hosts have failed and the running result is not ok 30583 1726853775.88797: done checking to see if all hosts have failed 30583 1726853775.88798: getting the remaining hosts for this loop 30583 1726853775.88799: done getting the remaining hosts for this loop 30583 1726853775.88802: getting the next task for host managed_node2 30583 1726853775.88809: done getting next task for host managed_node2 30583 1726853775.88812: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30583 1726853775.88817: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853775.88836: getting variables 30583 1726853775.88838: in VariableManager get_vars() 30583 1726853775.88881: Calling all_inventory to load vars for managed_node2 30583 1726853775.88884: Calling groups_inventory to load vars for managed_node2 30583 1726853775.88886: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853775.88895: Calling all_plugins_play to load vars for managed_node2 30583 1726853775.88898: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853775.88901: Calling groups_plugins_play to load vars for managed_node2 30583 1726853775.90308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853775.92119: done with get_vars() 30583 1726853775.92142: done getting variables 30583 1726853775.92204: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:36:15 -0400 (0:00:00.055) 0:01:51.259 ****** 30583 1726853775.92250: entering _queue_task() for managed_node2/fail 30583 1726853775.92632: worker is 1 (out of 1 available) 30583 1726853775.92646: exiting _queue_task() for managed_node2/fail 30583 1726853775.92776: done queuing things up, now waiting for results queue to drain 30583 1726853775.92778: waiting for pending results... 30583 1726853775.93096: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30583 1726853775.93189: in run() - task 02083763-bbaf-05ea-abc5-0000000021a6 30583 1726853775.93193: variable 'ansible_search_path' from source: unknown 30583 1726853775.93200: variable 'ansible_search_path' from source: unknown 30583 1726853775.93233: calling self._execute() 30583 1726853775.93405: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853775.93409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853775.93416: variable 'omit' from source: magic vars 30583 1726853775.93799: variable 'ansible_distribution_major_version' from source: facts 30583 1726853775.93816: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853775.93969: variable 'network_state' from source: role '' defaults 30583 1726853775.93986: Evaluated conditional (network_state != {}): False 30583 1726853775.93993: when evaluation is False, skipping this task 30583 1726853775.94001: _execute() done 30583 1726853775.94008: dumping result to json 30583 1726853775.94015: done dumping result, returning 30583 1726853775.94026: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [02083763-bbaf-05ea-abc5-0000000021a6] 30583 1726853775.94056: sending task result for task 02083763-bbaf-05ea-abc5-0000000021a6 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853775.94209: no more pending results, returning what we have 30583 1726853775.94213: results queue empty 30583 1726853775.94215: checking for any_errors_fatal 30583 1726853775.94223: done checking for any_errors_fatal 30583 1726853775.94224: checking for max_fail_percentage 30583 1726853775.94226: done checking for max_fail_percentage 30583 1726853775.94227: checking to see if all hosts have failed and the running result is not ok 30583 1726853775.94228: done checking to see if all hosts have failed 30583 1726853775.94228: getting the remaining hosts for this loop 30583 1726853775.94230: done getting the remaining hosts for this loop 30583 1726853775.94235: getting the next task for host managed_node2 30583 1726853775.94244: done getting next task for host managed_node2 30583 1726853775.94248: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30583 1726853775.94255: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853775.94282: getting variables 30583 1726853775.94284: in VariableManager get_vars() 30583 1726853775.94332: Calling all_inventory to load vars for managed_node2 30583 1726853775.94334: Calling groups_inventory to load vars for managed_node2 30583 1726853775.94337: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853775.94348: Calling all_plugins_play to load vars for managed_node2 30583 1726853775.94351: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853775.94354: Calling groups_plugins_play to load vars for managed_node2 30583 1726853775.95087: done sending task result for task 02083763-bbaf-05ea-abc5-0000000021a6 30583 1726853775.95090: WORKER PROCESS EXITING 30583 1726853775.96052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853775.97737: done with get_vars() 30583 1726853775.97768: done getting variables 30583 1726853775.97837: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 
or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:36:15 -0400 (0:00:00.056) 0:01:51.315 ****** 30583 1726853775.97875: entering _queue_task() for managed_node2/fail 30583 1726853775.98309: worker is 1 (out of 1 available) 30583 1726853775.98322: exiting _queue_task() for managed_node2/fail 30583 1726853775.98333: done queuing things up, now waiting for results queue to drain 30583 1726853775.98335: waiting for pending results... 30583 1726853775.98605: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30583 1726853775.98775: in run() - task 02083763-bbaf-05ea-abc5-0000000021a7 30583 1726853775.98795: variable 'ansible_search_path' from source: unknown 30583 1726853775.98803: variable 'ansible_search_path' from source: unknown 30583 1726853775.98850: calling self._execute() 30583 1726853775.98965: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853775.98979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853775.98992: variable 'omit' from source: magic vars 30583 1726853775.99412: variable 'ansible_distribution_major_version' from source: facts 30583 1726853775.99477: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853775.99628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853776.02035: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853776.02501: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853776.02554: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 
1726853776.02594: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853776.02633: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853776.02734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853776.02876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853776.02880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.02883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853776.02885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853776.02950: variable 'ansible_distribution_major_version' from source: facts 30583 1726853776.02973: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30583 1726853776.03097: variable 'ansible_distribution' from source: facts 30583 1726853776.03115: variable '__network_rh_distros' from source: role '' defaults 30583 1726853776.03130: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30583 1726853776.03399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853776.03426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853776.03464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.03509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853776.03548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853776.03590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853776.03616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853776.03657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.03767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853776.03772: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853776.03775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853776.03787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853776.03815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.03855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853776.03884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853776.04375: variable 'network_connections' from source: include params 30583 1726853776.04398: variable 'interface' from source: play vars 30583 1726853776.04463: variable 'interface' from source: play vars 30583 1726853776.04484: variable 'network_state' from source: role '' defaults 30583 1726853776.04556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853776.04737: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853776.04780: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 
1726853776.04833: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853776.04942: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853776.04946: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853776.04962: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853776.05006: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.05038: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853776.05089: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30583 1726853776.05098: when evaluation is False, skipping this task 30583 1726853776.05109: _execute() done 30583 1726853776.05115: dumping result to json 30583 1726853776.05123: done dumping result, returning 30583 1726853776.05160: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-05ea-abc5-0000000021a7] 30583 1726853776.05163: sending task result for task 02083763-bbaf-05ea-abc5-0000000021a7 skipping: [managed_node2] => { "changed": 
false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30583 1726853776.05429: no more pending results, returning what we have 30583 1726853776.05434: results queue empty 30583 1726853776.05436: checking for any_errors_fatal 30583 1726853776.05442: done checking for any_errors_fatal 30583 1726853776.05442: checking for max_fail_percentage 30583 1726853776.05445: done checking for max_fail_percentage 30583 1726853776.05446: checking to see if all hosts have failed and the running result is not ok 30583 1726853776.05447: done checking to see if all hosts have failed 30583 1726853776.05448: getting the remaining hosts for this loop 30583 1726853776.05450: done getting the remaining hosts for this loop 30583 1726853776.05454: getting the next task for host managed_node2 30583 1726853776.05463: done getting next task for host managed_node2 30583 1726853776.05468: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30583 1726853776.05475: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853776.05503: getting variables 30583 1726853776.05506: in VariableManager get_vars() 30583 1726853776.05558: Calling all_inventory to load vars for managed_node2 30583 1726853776.05562: Calling groups_inventory to load vars for managed_node2 30583 1726853776.05565: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853776.05705: Calling all_plugins_play to load vars for managed_node2 30583 1726853776.05710: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853776.05714: Calling groups_plugins_play to load vars for managed_node2 30583 1726853776.06321: done sending task result for task 02083763-bbaf-05ea-abc5-0000000021a7 30583 1726853776.06325: WORKER PROCESS EXITING 30583 1726853776.07599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853776.09096: done with get_vars() 30583 1726853776.09129: done getting variables 30583 1726853776.09194: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due 
to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:36:16 -0400 (0:00:00.113) 0:01:51.429 ****** 30583 1726853776.09233: entering _queue_task() for managed_node2/dnf 30583 1726853776.09693: worker is 1 (out of 1 available) 30583 1726853776.09709: exiting _queue_task() for managed_node2/dnf 30583 1726853776.09719: done queuing things up, now waiting for results queue to drain 30583 1726853776.09720: waiting for pending results... 30583 1726853776.09948: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30583 1726853776.10108: in run() - task 02083763-bbaf-05ea-abc5-0000000021a8 30583 1726853776.10127: variable 'ansible_search_path' from source: unknown 30583 1726853776.10135: variable 'ansible_search_path' from source: unknown 30583 1726853776.10184: calling self._execute() 30583 1726853776.10297: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853776.10310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853776.10325: variable 'omit' from source: magic vars 30583 1726853776.10744: variable 'ansible_distribution_major_version' from source: facts 30583 1726853776.10761: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853776.10976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853776.12915: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853776.12974: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853776.13003: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853776.13027: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853776.13048: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853776.13109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853776.13129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853776.13146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.13176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853776.13189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853776.13267: variable 'ansible_distribution' from source: facts 30583 1726853776.13274: variable 'ansible_distribution_major_version' from source: facts 30583 1726853776.13287: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30583 1726853776.13364: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853776.13448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853776.13465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853776.13483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.13512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853776.13521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853776.13548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853776.13564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853776.13582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.13607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853776.13619: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853776.13646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853776.13664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853776.13680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.13704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853776.13715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853776.13820: variable 'network_connections' from source: include params 30583 1726853776.13831: variable 'interface' from source: play vars 30583 1726853776.13877: variable 'interface' from source: play vars 30583 1726853776.13939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853776.14105: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853776.14120: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853776.14176: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853776.14179: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853776.14217: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853776.14241: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853776.14376: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.14380: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853776.14383: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853776.14572: variable 'network_connections' from source: include params 30583 1726853776.14577: variable 'interface' from source: play vars 30583 1726853776.14632: variable 'interface' from source: play vars 30583 1726853776.14663: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853776.14667: when evaluation is False, skipping this task 30583 1726853776.14670: _execute() done 30583 1726853776.14674: dumping result to json 30583 1726853776.14677: done dumping result, returning 30583 1726853776.14717: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-0000000021a8] 30583 
1726853776.14722: sending task result for task 02083763-bbaf-05ea-abc5-0000000021a8 30583 1726853776.14790: done sending task result for task 02083763-bbaf-05ea-abc5-0000000021a8 30583 1726853776.14793: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853776.14875: no more pending results, returning what we have 30583 1726853776.14879: results queue empty 30583 1726853776.14881: checking for any_errors_fatal 30583 1726853776.14887: done checking for any_errors_fatal 30583 1726853776.14888: checking for max_fail_percentage 30583 1726853776.14889: done checking for max_fail_percentage 30583 1726853776.14890: checking to see if all hosts have failed and the running result is not ok 30583 1726853776.14891: done checking to see if all hosts have failed 30583 1726853776.14892: getting the remaining hosts for this loop 30583 1726853776.14894: done getting the remaining hosts for this loop 30583 1726853776.14897: getting the next task for host managed_node2 30583 1726853776.14906: done getting next task for host managed_node2 30583 1726853776.14909: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30583 1726853776.14914: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853776.14935: getting variables 30583 1726853776.14936: in VariableManager get_vars() 30583 1726853776.14983: Calling all_inventory to load vars for managed_node2 30583 1726853776.14986: Calling groups_inventory to load vars for managed_node2 30583 1726853776.14988: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853776.14997: Calling all_plugins_play to load vars for managed_node2 30583 1726853776.15000: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853776.15002: Calling groups_plugins_play to load vars for managed_node2 30583 1726853776.15943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853776.16821: done with get_vars() 30583 1726853776.16837: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30583 1726853776.16892: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:36:16 -0400 (0:00:00.076) 0:01:51.506 ****** 30583 1726853776.16918: entering _queue_task() for managed_node2/yum 30583 1726853776.17159: worker is 1 (out of 1 available) 30583 1726853776.17175: exiting _queue_task() for managed_node2/yum 30583 1726853776.17187: done queuing things up, now waiting for results queue to drain 30583 1726853776.17188: waiting for pending results... 30583 1726853776.17688: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30583 1726853776.17694: in run() - task 02083763-bbaf-05ea-abc5-0000000021a9 30583 1726853776.17697: variable 'ansible_search_path' from source: unknown 30583 1726853776.17700: variable 'ansible_search_path' from source: unknown 30583 1726853776.17702: calling self._execute() 30583 1726853776.17728: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853776.17740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853776.17753: variable 'omit' from source: magic vars 30583 1726853776.18146: variable 'ansible_distribution_major_version' from source: facts 30583 1726853776.18167: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853776.18352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853776.20936: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853776.21011: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853776.21069: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853776.21110: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853776.21139: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853776.21226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853776.21263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853776.21305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.21351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853776.21377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853776.21490: variable 'ansible_distribution_major_version' from source: facts 30583 1726853776.21511: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30583 1726853776.21520: when evaluation is False, skipping this task 30583 1726853776.21527: _execute() done 30583 1726853776.21534: dumping result to json 30583 1726853776.21542: done dumping result, returning 30583 1726853776.21555: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-0000000021a9] 30583 1726853776.21568: sending task result for task 02083763-bbaf-05ea-abc5-0000000021a9 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30583 1726853776.21737: no more pending results, returning what we have 30583 1726853776.21741: results queue empty 30583 1726853776.21742: checking for any_errors_fatal 30583 1726853776.21747: done checking for any_errors_fatal 30583 1726853776.21748: checking for max_fail_percentage 30583 1726853776.21750: done checking for max_fail_percentage 30583 1726853776.21751: checking to see if all hosts have failed and the running result is not ok 30583 1726853776.21751: done checking to see if all hosts have failed 30583 1726853776.21752: getting the remaining hosts for this loop 30583 1726853776.21754: done getting the remaining hosts for this loop 30583 1726853776.21757: getting the next task for host managed_node2 30583 1726853776.21765: done getting next task for host managed_node2 30583 1726853776.21769: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30583 1726853776.21776: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853776.21801: getting variables 30583 1726853776.21802: in VariableManager get_vars() 30583 1726853776.21848: Calling all_inventory to load vars for managed_node2 30583 1726853776.21851: Calling groups_inventory to load vars for managed_node2 30583 1726853776.21853: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853776.21864: Calling all_plugins_play to load vars for managed_node2 30583 1726853776.21866: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853776.21870: Calling groups_plugins_play to load vars for managed_node2 30583 1726853776.22601: done sending task result for task 02083763-bbaf-05ea-abc5-0000000021a9 30583 1726853776.22604: WORKER PROCESS EXITING 30583 1726853776.23632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853776.25209: done with get_vars() 30583 1726853776.25235: done getting variables 30583 1726853776.25302: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** 
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:36:16 -0400 (0:00:00.084) 0:01:51.590 ****** 30583 1726853776.25339: entering _queue_task() for managed_node2/fail 30583 1726853776.25719: worker is 1 (out of 1 available) 30583 1726853776.25732: exiting _queue_task() for managed_node2/fail 30583 1726853776.25745: done queuing things up, now waiting for results queue to drain 30583 1726853776.25746: waiting for pending results... 30583 1726853776.26103: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30583 1726853776.26378: in run() - task 02083763-bbaf-05ea-abc5-0000000021aa 30583 1726853776.26382: variable 'ansible_search_path' from source: unknown 30583 1726853776.26386: variable 'ansible_search_path' from source: unknown 30583 1726853776.26389: calling self._execute() 30583 1726853776.26423: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853776.26435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853776.26450: variable 'omit' from source: magic vars 30583 1726853776.26874: variable 'ansible_distribution_major_version' from source: facts 30583 1726853776.26892: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853776.27027: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853776.27233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853776.29585: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853776.29674: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853776.29716: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853776.29753: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853776.29976: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853776.29980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853776.29983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853776.29985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.29987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853776.30002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853776.30052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853776.30084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853776.30117: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.30163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853776.30187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853776.30235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853776.30265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853776.30296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.30341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853776.30362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853776.30550: variable 'network_connections' from source: include params 30583 1726853776.30574: variable 'interface' from source: play vars 30583 1726853776.30654: variable 'interface' from source: play vars 30583 1726853776.30731: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853776.30908: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853776.30962: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853776.31005: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853776.31076: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853776.31096: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853776.31123: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853776.31151: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.31189: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853776.31266: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853776.31440: variable 'network_connections' from source: include params 30583 1726853776.31443: variable 'interface' from source: play vars 30583 1726853776.31488: variable 'interface' from source: play vars 30583 1726853776.31513: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853776.31516: when evaluation is False, skipping this task 30583 
1726853776.31519: _execute() done 30583 1726853776.31523: dumping result to json 30583 1726853776.31526: done dumping result, returning 30583 1726853776.31533: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-0000000021aa] 30583 1726853776.31535: sending task result for task 02083763-bbaf-05ea-abc5-0000000021aa 30583 1726853776.31628: done sending task result for task 02083763-bbaf-05ea-abc5-0000000021aa 30583 1726853776.31631: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853776.31688: no more pending results, returning what we have 30583 1726853776.31692: results queue empty 30583 1726853776.31693: checking for any_errors_fatal 30583 1726853776.31699: done checking for any_errors_fatal 30583 1726853776.31700: checking for max_fail_percentage 30583 1726853776.31702: done checking for max_fail_percentage 30583 1726853776.31703: checking to see if all hosts have failed and the running result is not ok 30583 1726853776.31703: done checking to see if all hosts have failed 30583 1726853776.31704: getting the remaining hosts for this loop 30583 1726853776.31706: done getting the remaining hosts for this loop 30583 1726853776.31709: getting the next task for host managed_node2 30583 1726853776.31717: done getting next task for host managed_node2 30583 1726853776.31721: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30583 1726853776.31726: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853776.31746: getting variables 30583 1726853776.31747: in VariableManager get_vars() 30583 1726853776.31797: Calling all_inventory to load vars for managed_node2 30583 1726853776.31800: Calling groups_inventory to load vars for managed_node2 30583 1726853776.31802: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853776.31811: Calling all_plugins_play to load vars for managed_node2 30583 1726853776.31813: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853776.31815: Calling groups_plugins_play to load vars for managed_node2 30583 1726853776.32652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853776.34062: done with get_vars() 30583 1726853776.34090: done getting variables 30583 1726853776.34140: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:36:16 -0400 (0:00:00.088) 0:01:51.678 ****** 30583 1726853776.34170: entering _queue_task() for managed_node2/package 30583 1726853776.34452: worker is 1 (out of 1 available) 30583 1726853776.34466: exiting _queue_task() for managed_node2/package 30583 1726853776.34480: done queuing things up, now waiting for results queue to drain 30583 1726853776.34482: waiting for pending results... 30583 1726853776.34687: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 30583 1726853776.34792: in run() - task 02083763-bbaf-05ea-abc5-0000000021ab 30583 1726853776.34801: variable 'ansible_search_path' from source: unknown 30583 1726853776.34805: variable 'ansible_search_path' from source: unknown 30583 1726853776.34836: calling self._execute() 30583 1726853776.34914: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853776.34920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853776.34927: variable 'omit' from source: magic vars 30583 1726853776.35217: variable 'ansible_distribution_major_version' from source: facts 30583 1726853776.35227: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853776.35365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853776.35564: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853776.35599: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853776.35623: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853776.35685: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853776.35763: variable 'network_packages' from source: role '' defaults 30583 1726853776.35836: variable '__network_provider_setup' from source: role '' defaults 30583 1726853776.35843: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853776.35892: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853776.35900: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853776.35943: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853776.36061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853776.37390: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853776.37434: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853776.37464: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853776.37486: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853776.37515: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853776.37589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853776.37608: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853776.37631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.37655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853776.37666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853776.37699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853776.37714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853776.37731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.37760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853776.37769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 
1726853776.37976: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853776.38086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853776.38276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853776.38280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.38282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853776.38284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853776.38302: variable 'ansible_python' from source: facts 30583 1726853776.38324: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853776.38420: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853776.38517: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853776.38650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853776.38684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853776.38712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.38764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853776.38786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853776.38842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853776.38886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853776.38914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.38978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853776.39006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853776.39101: variable 'network_connections' from source: include params 
30583 1726853776.39105: variable 'interface' from source: play vars 30583 1726853776.39188: variable 'interface' from source: play vars 30583 1726853776.39240: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853776.39261: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853776.39285: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.39306: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853776.39343: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853776.39525: variable 'network_connections' from source: include params 30583 1726853776.39528: variable 'interface' from source: play vars 30583 1726853776.39603: variable 'interface' from source: play vars 30583 1726853776.39637: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853776.39692: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853776.39889: variable 'network_connections' from source: include params 30583 1726853776.39893: variable 'interface' from source: play vars 30583 1726853776.39940: variable 'interface' from source: play vars 30583 1726853776.39960: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853776.40010: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853776.40207: variable 'network_connections' 
from source: include params 30583 1726853776.40210: variable 'interface' from source: play vars 30583 1726853776.40260: variable 'interface' from source: play vars 30583 1726853776.40303: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853776.40344: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853776.40350: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853776.40395: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853776.40528: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853776.40976: variable 'network_connections' from source: include params 30583 1726853776.40981: variable 'interface' from source: play vars 30583 1726853776.41006: variable 'interface' from source: play vars 30583 1726853776.41015: variable 'ansible_distribution' from source: facts 30583 1726853776.41018: variable '__network_rh_distros' from source: role '' defaults 30583 1726853776.41024: variable 'ansible_distribution_major_version' from source: facts 30583 1726853776.41053: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853776.41220: variable 'ansible_distribution' from source: facts 30583 1726853776.41223: variable '__network_rh_distros' from source: role '' defaults 30583 1726853776.41228: variable 'ansible_distribution_major_version' from source: facts 30583 1726853776.41237: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853776.41399: variable 'ansible_distribution' from source: facts 30583 1726853776.41403: variable '__network_rh_distros' from source: role '' defaults 30583 1726853776.41407: variable 'ansible_distribution_major_version' from source: facts 30583 1726853776.41440: variable 'network_provider' from source: set_fact 30583 
1726853776.41454: variable 'ansible_facts' from source: unknown 30583 1726853776.42176: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30583 1726853776.42180: when evaluation is False, skipping this task 30583 1726853776.42182: _execute() done 30583 1726853776.42184: dumping result to json 30583 1726853776.42186: done dumping result, returning 30583 1726853776.42189: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-05ea-abc5-0000000021ab] 30583 1726853776.42190: sending task result for task 02083763-bbaf-05ea-abc5-0000000021ab skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30583 1726853776.42425: no more pending results, returning what we have 30583 1726853776.42429: results queue empty 30583 1726853776.42430: checking for any_errors_fatal 30583 1726853776.42437: done checking for any_errors_fatal 30583 1726853776.42437: checking for max_fail_percentage 30583 1726853776.42439: done checking for max_fail_percentage 30583 1726853776.42440: checking to see if all hosts have failed and the running result is not ok 30583 1726853776.42441: done checking to see if all hosts have failed 30583 1726853776.42441: getting the remaining hosts for this loop 30583 1726853776.42443: done getting the remaining hosts for this loop 30583 1726853776.42447: getting the next task for host managed_node2 30583 1726853776.42454: done getting next task for host managed_node2 30583 1726853776.42458: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30583 1726853776.42463: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853776.42482: getting variables 30583 1726853776.42483: in VariableManager get_vars() 30583 1726853776.42525: Calling all_inventory to load vars for managed_node2 30583 1726853776.42527: Calling groups_inventory to load vars for managed_node2 30583 1726853776.42534: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853776.42541: Calling all_plugins_play to load vars for managed_node2 30583 1726853776.42544: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853776.42546: Calling groups_plugins_play to load vars for managed_node2 30583 1726853776.43467: done sending task result for task 02083763-bbaf-05ea-abc5-0000000021ab 30583 1726853776.43472: WORKER PROCESS EXITING 30583 1726853776.43478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853776.44349: done with get_vars() 30583 1726853776.44366: done getting variables 30583 1726853776.44416: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:36:16 -0400 (0:00:00.102) 0:01:51.781 ****** 30583 1726853776.44466: entering _queue_task() for managed_node2/package 30583 1726853776.44840: worker is 1 (out of 1 available) 30583 1726853776.44852: exiting _queue_task() for managed_node2/package 30583 1726853776.44866: done queuing things up, now waiting for results queue to drain 30583 1726853776.44867: waiting for pending results... 30583 1726853776.45203: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30583 1726853776.45355: in run() - task 02083763-bbaf-05ea-abc5-0000000021ac 30583 1726853776.45367: variable 'ansible_search_path' from source: unknown 30583 1726853776.45374: variable 'ansible_search_path' from source: unknown 30583 1726853776.45407: calling self._execute() 30583 1726853776.45488: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853776.45492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853776.45504: variable 'omit' from source: magic vars 30583 1726853776.45809: variable 'ansible_distribution_major_version' from source: facts 30583 1726853776.45818: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853776.45905: variable 'network_state' from source: role '' defaults 30583 1726853776.45913: Evaluated conditional (network_state != {}): False 30583 1726853776.45916: when evaluation 
is False, skipping this task 30583 1726853776.45919: _execute() done 30583 1726853776.45921: dumping result to json 30583 1726853776.45924: done dumping result, returning 30583 1726853776.45932: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-05ea-abc5-0000000021ac] 30583 1726853776.45936: sending task result for task 02083763-bbaf-05ea-abc5-0000000021ac 30583 1726853776.46026: done sending task result for task 02083763-bbaf-05ea-abc5-0000000021ac 30583 1726853776.46029: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853776.46105: no more pending results, returning what we have 30583 1726853776.46109: results queue empty 30583 1726853776.46111: checking for any_errors_fatal 30583 1726853776.46117: done checking for any_errors_fatal 30583 1726853776.46117: checking for max_fail_percentage 30583 1726853776.46120: done checking for max_fail_percentage 30583 1726853776.46121: checking to see if all hosts have failed and the running result is not ok 30583 1726853776.46121: done checking to see if all hosts have failed 30583 1726853776.46122: getting the remaining hosts for this loop 30583 1726853776.46124: done getting the remaining hosts for this loop 30583 1726853776.46127: getting the next task for host managed_node2 30583 1726853776.46134: done getting next task for host managed_node2 30583 1726853776.46138: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30583 1726853776.46142: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853776.46160: getting variables 30583 1726853776.46164: in VariableManager get_vars() 30583 1726853776.46202: Calling all_inventory to load vars for managed_node2 30583 1726853776.46205: Calling groups_inventory to load vars for managed_node2 30583 1726853776.46207: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853776.46214: Calling all_plugins_play to load vars for managed_node2 30583 1726853776.46217: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853776.46219: Calling groups_plugins_play to load vars for managed_node2 30583 1726853776.47150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853776.48011: done with get_vars() 30583 1726853776.48026: done getting variables 30583 1726853776.48069: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:36:16 -0400 (0:00:00.036) 0:01:51.818 ****** 30583 1726853776.48096: entering _queue_task() for managed_node2/package 30583 1726853776.48333: worker is 1 (out of 1 available) 30583 1726853776.48347: exiting _queue_task() for managed_node2/package 30583 1726853776.48362: done queuing things up, now waiting for results queue to drain 30583 1726853776.48363: waiting for pending results... 30583 1726853776.48560: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30583 1726853776.48666: in run() - task 02083763-bbaf-05ea-abc5-0000000021ad 30583 1726853776.48678: variable 'ansible_search_path' from source: unknown 30583 1726853776.48681: variable 'ansible_search_path' from source: unknown 30583 1726853776.48712: calling self._execute() 30583 1726853776.48790: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853776.48794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853776.48807: variable 'omit' from source: magic vars 30583 1726853776.49090: variable 'ansible_distribution_major_version' from source: facts 30583 1726853776.49100: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853776.49187: variable 'network_state' from source: role '' defaults 30583 1726853776.49195: Evaluated conditional (network_state != {}): False 30583 1726853776.49199: when evaluation is False, skipping this task 30583 1726853776.49202: _execute() done 30583 1726853776.49205: dumping 
result to json 30583 1726853776.49207: done dumping result, returning 30583 1726853776.49215: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-05ea-abc5-0000000021ad] 30583 1726853776.49217: sending task result for task 02083763-bbaf-05ea-abc5-0000000021ad 30583 1726853776.49311: done sending task result for task 02083763-bbaf-05ea-abc5-0000000021ad 30583 1726853776.49313: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853776.49390: no more pending results, returning what we have 30583 1726853776.49394: results queue empty 30583 1726853776.49395: checking for any_errors_fatal 30583 1726853776.49402: done checking for any_errors_fatal 30583 1726853776.49402: checking for max_fail_percentage 30583 1726853776.49404: done checking for max_fail_percentage 30583 1726853776.49405: checking to see if all hosts have failed and the running result is not ok 30583 1726853776.49405: done checking to see if all hosts have failed 30583 1726853776.49406: getting the remaining hosts for this loop 30583 1726853776.49408: done getting the remaining hosts for this loop 30583 1726853776.49411: getting the next task for host managed_node2 30583 1726853776.49418: done getting next task for host managed_node2 30583 1726853776.49423: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30583 1726853776.49427: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853776.49445: getting variables 30583 1726853776.49447: in VariableManager get_vars() 30583 1726853776.49486: Calling all_inventory to load vars for managed_node2 30583 1726853776.49489: Calling groups_inventory to load vars for managed_node2 30583 1726853776.49491: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853776.49498: Calling all_plugins_play to load vars for managed_node2 30583 1726853776.49501: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853776.49503: Calling groups_plugins_play to load vars for managed_node2 30583 1726853776.50278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853776.51160: done with get_vars() 30583 1726853776.51177: done getting variables 30583 1726853776.51218: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or 
team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:36:16 -0400 (0:00:00.031) 0:01:51.849 ****** 30583 1726853776.51242: entering _queue_task() for managed_node2/service 30583 1726853776.51477: worker is 1 (out of 1 available) 30583 1726853776.51492: exiting _queue_task() for managed_node2/service 30583 1726853776.51506: done queuing things up, now waiting for results queue to drain 30583 1726853776.51507: waiting for pending results... 30583 1726853776.51692: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30583 1726853776.51801: in run() - task 02083763-bbaf-05ea-abc5-0000000021ae 30583 1726853776.51811: variable 'ansible_search_path' from source: unknown 30583 1726853776.51814: variable 'ansible_search_path' from source: unknown 30583 1726853776.51846: calling self._execute() 30583 1726853776.51922: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853776.51927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853776.51936: variable 'omit' from source: magic vars 30583 1726853776.52215: variable 'ansible_distribution_major_version' from source: facts 30583 1726853776.52224: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853776.52310: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853776.52439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853776.54193: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853776.54241: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853776.54276: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853776.54301: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853776.54321: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853776.54385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853776.54405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853776.54422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.54446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853776.54463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853776.54496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853776.54511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853776.54528: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.54552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853776.54567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853776.54594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853776.54609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853776.54625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.54650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853776.54662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853776.54773: variable 'network_connections' from source: include params 30583 1726853776.54784: variable 'interface' from source: play vars 30583 1726853776.54831: variable 'interface' from source: play vars 30583 1726853776.54886: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853776.54991: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853776.55021: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853776.55044: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853776.55075: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853776.55105: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853776.55123: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853776.55140: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.55159: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853776.55204: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853776.55352: variable 'network_connections' from source: include params 30583 1726853776.55355: variable 'interface' from source: play vars 30583 1726853776.55400: variable 'interface' from source: play vars 30583 1726853776.55423: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853776.55426: when evaluation is False, skipping this task 30583 
1726853776.55429: _execute() done 30583 1726853776.55431: dumping result to json 30583 1726853776.55436: done dumping result, returning 30583 1726853776.55446: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-0000000021ae] 30583 1726853776.55449: sending task result for task 02083763-bbaf-05ea-abc5-0000000021ae 30583 1726853776.55533: done sending task result for task 02083763-bbaf-05ea-abc5-0000000021ae 30583 1726853776.55543: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853776.55595: no more pending results, returning what we have 30583 1726853776.55598: results queue empty 30583 1726853776.55599: checking for any_errors_fatal 30583 1726853776.55606: done checking for any_errors_fatal 30583 1726853776.55607: checking for max_fail_percentage 30583 1726853776.55609: done checking for max_fail_percentage 30583 1726853776.55610: checking to see if all hosts have failed and the running result is not ok 30583 1726853776.55610: done checking to see if all hosts have failed 30583 1726853776.55611: getting the remaining hosts for this loop 30583 1726853776.55613: done getting the remaining hosts for this loop 30583 1726853776.55616: getting the next task for host managed_node2 30583 1726853776.55624: done getting next task for host managed_node2 30583 1726853776.55628: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30583 1726853776.55633: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853776.55652: getting variables 30583 1726853776.55653: in VariableManager get_vars() 30583 1726853776.55704: Calling all_inventory to load vars for managed_node2 30583 1726853776.55706: Calling groups_inventory to load vars for managed_node2 30583 1726853776.55709: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853776.55717: Calling all_plugins_play to load vars for managed_node2 30583 1726853776.55719: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853776.55722: Calling groups_plugins_play to load vars for managed_node2 30583 1726853776.56693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853776.57562: done with get_vars() 30583 1726853776.57582: done getting variables 30583 1726853776.57626: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:36:16 -0400 (0:00:00.064) 0:01:51.913 ****** 30583 1726853776.57650: entering _queue_task() for managed_node2/service 30583 1726853776.57916: worker is 1 (out of 1 available) 30583 1726853776.57931: exiting _queue_task() for managed_node2/service 30583 1726853776.57943: done queuing things up, now waiting for results queue to drain 30583 1726853776.57945: waiting for pending results... 30583 1726853776.58139: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30583 1726853776.58241: in run() - task 02083763-bbaf-05ea-abc5-0000000021af 30583 1726853776.58251: variable 'ansible_search_path' from source: unknown 30583 1726853776.58255: variable 'ansible_search_path' from source: unknown 30583 1726853776.58289: calling self._execute() 30583 1726853776.58365: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853776.58369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853776.58379: variable 'omit' from source: magic vars 30583 1726853776.58663: variable 'ansible_distribution_major_version' from source: facts 30583 1726853776.58670: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853776.58784: variable 'network_provider' from source: set_fact 30583 1726853776.58788: variable 'network_state' from source: role '' defaults 30583 1726853776.58797: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30583 1726853776.58803: variable 'omit' from source: magic vars 30583 1726853776.58848: variable 
'omit' from source: magic vars 30583 1726853776.58868: variable 'network_service_name' from source: role '' defaults 30583 1726853776.58915: variable 'network_service_name' from source: role '' defaults 30583 1726853776.58988: variable '__network_provider_setup' from source: role '' defaults 30583 1726853776.58992: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853776.59036: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853776.59049: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853776.59092: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853776.59246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853776.60705: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853776.60756: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853776.60789: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853776.60816: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853776.60835: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853776.60965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853776.60969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853776.60974: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.61019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853776.61022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853776.61376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853776.61380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853776.61383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.61385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853776.61387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853776.61628: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853776.61765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853776.61799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853776.61840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.61891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853776.61912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853776.62022: variable 'ansible_python' from source: facts 30583 1726853776.62052: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853776.62147: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853776.62241: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853776.62381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853776.62402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853776.62419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.62442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853776.62452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853776.62493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853776.62512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853776.62528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.62552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853776.62565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853776.62661: variable 'network_connections' from source: include params 30583 1726853776.62669: variable 'interface' from source: play vars 30583 1726853776.62724: variable 'interface' from source: play vars 30583 1726853776.62798: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853776.62934: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853776.63064: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853776.63067: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853776.63070: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853776.63176: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853776.63182: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853776.63184: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853776.63208: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853776.63252: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853776.63527: variable 'network_connections' from source: include params 30583 1726853776.63533: variable 'interface' from source: play vars 30583 1726853776.63610: variable 'interface' from source: play vars 30583 1726853776.63676: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853776.63736: variable '__network_wireless_connections_defined' from source: role '' defaults 
30583 1726853776.64009: variable 'network_connections' from source: include params 30583 1726853776.64012: variable 'interface' from source: play vars 30583 1726853776.64113: variable 'interface' from source: play vars 30583 1726853776.64117: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853776.64185: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853776.64389: variable 'network_connections' from source: include params 30583 1726853776.64392: variable 'interface' from source: play vars 30583 1726853776.64444: variable 'interface' from source: play vars 30583 1726853776.64490: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853776.64534: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853776.64537: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853776.64583: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853776.64715: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853776.65035: variable 'network_connections' from source: include params 30583 1726853776.65039: variable 'interface' from source: play vars 30583 1726853776.65085: variable 'interface' from source: play vars 30583 1726853776.65093: variable 'ansible_distribution' from source: facts 30583 1726853776.65096: variable '__network_rh_distros' from source: role '' defaults 30583 1726853776.65103: variable 'ansible_distribution_major_version' from source: facts 30583 1726853776.65125: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853776.65236: variable 'ansible_distribution' from source: facts 30583 1726853776.65239: variable '__network_rh_distros' from source: role '' defaults 30583 1726853776.65244: variable 'ansible_distribution_major_version' from 
source: facts 30583 1726853776.65252: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853776.65365: variable 'ansible_distribution' from source: facts 30583 1726853776.65368: variable '__network_rh_distros' from source: role '' defaults 30583 1726853776.65373: variable 'ansible_distribution_major_version' from source: facts 30583 1726853776.65400: variable 'network_provider' from source: set_fact 30583 1726853776.65417: variable 'omit' from source: magic vars 30583 1726853776.65438: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853776.65458: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853776.65477: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853776.65492: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853776.65501: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853776.65524: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853776.65527: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853776.65530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853776.65604: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853776.65609: Set connection var ansible_timeout to 10 30583 1726853776.65612: Set connection var ansible_connection to ssh 30583 1726853776.65617: Set connection var ansible_shell_executable to /bin/sh 30583 1726853776.65620: Set connection var ansible_shell_type to sh 30583 1726853776.65628: Set connection var ansible_pipelining to False 30583 1726853776.65647: variable 'ansible_shell_executable' from 
source: unknown 30583 1726853776.65650: variable 'ansible_connection' from source: unknown 30583 1726853776.65652: variable 'ansible_module_compression' from source: unknown 30583 1726853776.65655: variable 'ansible_shell_type' from source: unknown 30583 1726853776.65657: variable 'ansible_shell_executable' from source: unknown 30583 1726853776.65662: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853776.65666: variable 'ansible_pipelining' from source: unknown 30583 1726853776.65668: variable 'ansible_timeout' from source: unknown 30583 1726853776.65674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853776.65747: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853776.65755: variable 'omit' from source: magic vars 30583 1726853776.65763: starting attempt loop 30583 1726853776.65766: running the handler 30583 1726853776.65841: variable 'ansible_facts' from source: unknown 30583 1726853776.66683: _low_level_execute_command(): starting 30583 1726853776.66687: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853776.67168: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853776.67187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853776.67205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853776.67243: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853776.67255: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853776.67340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853776.69060: stdout chunk (state=3): >>>/root <<< 30583 1726853776.69276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853776.69279: stderr chunk (state=3): >>><<< 30583 1726853776.69281: stdout chunk (state=3): >>><<< 30583 1726853776.69284: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 
10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853776.69288: _low_level_execute_command(): starting 30583 1726853776.69290: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853776.692398-35742-242941806636821 `" && echo ansible-tmp-1726853776.692398-35742-242941806636821="` echo /root/.ansible/tmp/ansible-tmp-1726853776.692398-35742-242941806636821 `" ) && sleep 0' 30583 1726853776.69873: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853776.69884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853776.69896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853776.69910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853776.69932: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853776.69939: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853776.69949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853776.69967: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853776.69981: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853776.69989: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853776.69997: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 30583 1726853776.70007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853776.70152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853776.70156: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853776.70161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853776.70210: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853776.72262: stdout chunk (state=3): >>>ansible-tmp-1726853776.692398-35742-242941806636821=/root/.ansible/tmp/ansible-tmp-1726853776.692398-35742-242941806636821 <<< 30583 1726853776.72418: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853776.72422: stdout chunk (state=3): >>><<< 30583 1726853776.72424: stderr chunk (state=3): >>><<< 30583 1726853776.72442: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853776.692398-35742-242941806636821=/root/.ansible/tmp/ansible-tmp-1726853776.692398-35742-242941806636821 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853776.72576: variable 'ansible_module_compression' from source: unknown 30583 1726853776.72581: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30583 1726853776.72626: variable 'ansible_facts' from source: unknown 30583 1726853776.72870: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853776.692398-35742-242941806636821/AnsiballZ_systemd.py 30583 1726853776.73045: Sending initial data 30583 1726853776.73053: Sent initial data (155 bytes) 30583 1726853776.73630: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853776.73767: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853776.73772: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853776.73840: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853776.75536: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853776.75611: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853776.75695: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpy0r5sk1l /root/.ansible/tmp/ansible-tmp-1726853776.692398-35742-242941806636821/AnsiballZ_systemd.py <<< 30583 1726853776.75708: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853776.692398-35742-242941806636821/AnsiballZ_systemd.py" <<< 30583 1726853776.75775: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpy0r5sk1l" to remote "/root/.ansible/tmp/ansible-tmp-1726853776.692398-35742-242941806636821/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853776.692398-35742-242941806636821/AnsiballZ_systemd.py" <<< 30583 1726853776.77528: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853776.77566: stderr chunk (state=3): >>><<< 30583 1726853776.77581: stdout chunk (state=3): >>><<< 30583 1726853776.77676: done transferring module to remote 30583 1726853776.77680: _low_level_execute_command(): starting 30583 1726853776.77682: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853776.692398-35742-242941806636821/ /root/.ansible/tmp/ansible-tmp-1726853776.692398-35742-242941806636821/AnsiballZ_systemd.py && sleep 0' 30583 1726853776.78322: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853776.78337: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853776.78393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853776.78404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853776.78496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853776.78518: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853776.78623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853776.80529: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853776.80565: stderr chunk (state=3): >>><<< 30583 1726853776.80569: stdout chunk (state=3): >>><<< 30583 1726853776.80584: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853776.80587: _low_level_execute_command(): starting 30583 1726853776.80592: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853776.692398-35742-242941806636821/AnsiballZ_systemd.py && sleep 0' 30583 1726853776.81028: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853776.81032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853776.81035: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853776.81037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853776.81088: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853776.81098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853776.81174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853777.10967: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4608000", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3300442112", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2006174000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": 
"infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredum<<< 30583 1726853777.10994: stdout chunk (state=3): >>>pReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid 
cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", 
"SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "syst<<< 30583 1726853777.11009: stdout chunk (state=3): >>>em.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", 
"JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30583 1726853777.13378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853777.13383: stderr chunk (state=3): >>><<< 30583 1726853777.13385: stdout chunk (state=3): >>><<< 30583 1726853777.13389: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", 
"CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4608000", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3300442112", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2006174000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": 
"[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", 
"LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", 
"InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853777.13448: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853776.692398-35742-242941806636821/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853777.13470: _low_level_execute_command(): starting 30583 1726853777.13475: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853776.692398-35742-242941806636821/ > /dev/null 2>&1 && sleep 0' 30583 1726853777.14088: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853777.14097: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853777.14108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853777.14132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853777.14144: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853777.14151: stderr chunk (state=3): >>>debug2: match not found <<< 30583 1726853777.14162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853777.14179: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853777.14187: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853777.14193: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30583 1726853777.14201: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853777.14238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853777.14293: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853777.14316: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853777.14325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853777.14432: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853777.16478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853777.16482: stdout chunk 
(state=3): >>><<< 30583 1726853777.16484: stderr chunk (state=3): >>><<< 30583 1726853777.16487: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853777.16489: handler run complete 30583 1726853777.16505: attempt loop complete, returning result 30583 1726853777.16508: _execute() done 30583 1726853777.16510: dumping result to json 30583 1726853777.16539: done dumping result, returning 30583 1726853777.16550: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-05ea-abc5-0000000021af] 30583 1726853777.16555: sending task result for task 02083763-bbaf-05ea-abc5-0000000021af ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 
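The censored `ok` result above comes from the systemd action plugin; the uncensored `module_args` are still visible in the `done with _execute_module` line. A minimal standalone task reconstructed from those arguments might look like the sketch below. This is an illustrative reconstruction, not the role's actual source (which lives in the collection's `tasks/main.yml`); only `name`, `state`, `enabled`, and the `no_log` behavior are taken from the log, and the task name is assumed from the task banner.

```yaml
# Sketch of the task the log is executing, reconstructed from the
# module_args shown in the debug output above.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true  # matches the "'no_log: true' was specified" censoring in the result
```

Because the unit was already `active`/`enabled` (see `ActiveState` and `UnitFileState` in the dumped properties), the module reports `"changed": false`.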
30583 1726853777.17106: no more pending results, returning what we have 30583 1726853777.17110: results queue empty 30583 1726853777.17112: checking for any_errors_fatal 30583 1726853777.17118: done checking for any_errors_fatal 30583 1726853777.17119: checking for max_fail_percentage 30583 1726853777.17121: done checking for max_fail_percentage 30583 1726853777.17122: checking to see if all hosts have failed and the running result is not ok 30583 1726853777.17123: done checking to see if all hosts have failed 30583 1726853777.17123: getting the remaining hosts for this loop 30583 1726853777.17125: done getting the remaining hosts for this loop 30583 1726853777.17128: getting the next task for host managed_node2 30583 1726853777.17138: done getting next task for host managed_node2 30583 1726853777.17142: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30583 1726853777.17147: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30583 1726853777.17164: getting variables 30583 1726853777.17166: in VariableManager get_vars() 30583 1726853777.17335: Calling all_inventory to load vars for managed_node2 30583 1726853777.17339: Calling groups_inventory to load vars for managed_node2 30583 1726853777.17341: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853777.17353: Calling all_plugins_play to load vars for managed_node2 30583 1726853777.17356: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853777.17363: done sending task result for task 02083763-bbaf-05ea-abc5-0000000021af 30583 1726853777.17366: WORKER PROCESS EXITING 30583 1726853777.17434: Calling groups_plugins_play to load vars for managed_node2 30583 1726853777.19126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853777.21042: done with get_vars() 30583 1726853777.21069: done getting variables 30583 1726853777.21141: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:36:17 -0400 (0:00:00.635) 0:01:52.549 ****** 30583 1726853777.21181: entering _queue_task() for managed_node2/service 30583 1726853777.21796: worker is 1 (out of 1 available) 30583 1726853777.21807: exiting _queue_task() for managed_node2/service 30583 1726853777.21821: done queuing things up, now waiting for results queue to drain 30583 1726853777.21822: waiting for pending results... 
30583 1726853777.21946: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30583 1726853777.22123: in run() - task 02083763-bbaf-05ea-abc5-0000000021b0 30583 1726853777.22135: variable 'ansible_search_path' from source: unknown 30583 1726853777.22139: variable 'ansible_search_path' from source: unknown 30583 1726853777.22189: calling self._execute() 30583 1726853777.22298: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853777.22302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853777.22313: variable 'omit' from source: magic vars 30583 1726853777.22746: variable 'ansible_distribution_major_version' from source: facts 30583 1726853777.22758: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853777.22879: variable 'network_provider' from source: set_fact 30583 1726853777.22884: Evaluated conditional (network_provider == "nm"): True 30583 1726853777.22978: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853777.23069: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853777.23234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853777.25514: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853777.25676: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853777.25681: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853777.25683: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853777.25696: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853777.25798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853777.25826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853777.25851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853777.25903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853777.25917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853777.25961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853777.25996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853777.26020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853777.26057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853777.26075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853777.26122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853777.26276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853777.26279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853777.26282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853777.26284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853777.26376: variable 'network_connections' from source: include params 30583 1726853777.26389: variable 'interface' from source: play vars 30583 1726853777.26466: variable 'interface' from source: play vars 30583 1726853777.26553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853777.26727: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853777.26775: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853777.26805: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853777.26832: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853777.26885: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853777.26905: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853777.26929: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853777.26954: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853777.27015: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853777.27287: variable 'network_connections' from source: include params 30583 1726853777.27294: variable 'interface' from source: play vars 30583 1726853777.27337: variable 'interface' from source: play vars 30583 1726853777.27372: Evaluated conditional (__network_wpa_supplicant_required): False 30583 1726853777.27376: when evaluation is False, skipping this task 30583 1726853777.27378: _execute() done 30583 1726853777.27381: dumping result to json 30583 1726853777.27383: done dumping result, returning 30583 1726853777.27393: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-05ea-abc5-0000000021b0] 30583 
1726853777.27410: sending task result for task 02083763-bbaf-05ea-abc5-0000000021b0 30583 1726853777.27501: done sending task result for task 02083763-bbaf-05ea-abc5-0000000021b0 30583 1726853777.27505: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30583 1726853777.27551: no more pending results, returning what we have 30583 1726853777.27555: results queue empty 30583 1726853777.27556: checking for any_errors_fatal 30583 1726853777.27589: done checking for any_errors_fatal 30583 1726853777.27590: checking for max_fail_percentage 30583 1726853777.27593: done checking for max_fail_percentage 30583 1726853777.27594: checking to see if all hosts have failed and the running result is not ok 30583 1726853777.27594: done checking to see if all hosts have failed 30583 1726853777.27595: getting the remaining hosts for this loop 30583 1726853777.27597: done getting the remaining hosts for this loop 30583 1726853777.27601: getting the next task for host managed_node2 30583 1726853777.27614: done getting next task for host managed_node2 30583 1726853777.27618: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30583 1726853777.27623: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853777.27642: getting variables 30583 1726853777.27644: in VariableManager get_vars() 30583 1726853777.27694: Calling all_inventory to load vars for managed_node2 30583 1726853777.27697: Calling groups_inventory to load vars for managed_node2 30583 1726853777.27699: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853777.27708: Calling all_plugins_play to load vars for managed_node2 30583 1726853777.27710: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853777.27713: Calling groups_plugins_play to load vars for managed_node2 30583 1726853777.28561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853777.29774: done with get_vars() 30583 1726853777.29798: done getting variables 30583 1726853777.29852: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:36:17 -0400 (0:00:00.087) 0:01:52.636 
****** 30583 1726853777.29890: entering _queue_task() for managed_node2/service 30583 1726853777.30245: worker is 1 (out of 1 available) 30583 1726853777.30262: exiting _queue_task() for managed_node2/service 30583 1726853777.30476: done queuing things up, now waiting for results queue to drain 30583 1726853777.30478: waiting for pending results... 30583 1726853777.30621: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30583 1726853777.30724: in run() - task 02083763-bbaf-05ea-abc5-0000000021b1 30583 1726853777.30734: variable 'ansible_search_path' from source: unknown 30583 1726853777.30738: variable 'ansible_search_path' from source: unknown 30583 1726853777.30769: calling self._execute() 30583 1726853777.30858: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853777.30864: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853777.30874: variable 'omit' from source: magic vars 30583 1726853777.31167: variable 'ansible_distribution_major_version' from source: facts 30583 1726853777.31179: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853777.31255: variable 'network_provider' from source: set_fact 30583 1726853777.31264: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853777.31268: when evaluation is False, skipping this task 30583 1726853777.31272: _execute() done 30583 1726853777.31274: dumping result to json 30583 1726853777.31277: done dumping result, returning 30583 1726853777.31283: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-05ea-abc5-0000000021b1] 30583 1726853777.31288: sending task result for task 02083763-bbaf-05ea-abc5-0000000021b1 30583 1726853777.31375: done sending task result for task 02083763-bbaf-05ea-abc5-0000000021b1 30583 1726853777.31379: WORKER PROCESS EXITING skipping: 
[managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853777.31425: no more pending results, returning what we have 30583 1726853777.31428: results queue empty 30583 1726853777.31429: checking for any_errors_fatal 30583 1726853777.31442: done checking for any_errors_fatal 30583 1726853777.31443: checking for max_fail_percentage 30583 1726853777.31445: done checking for max_fail_percentage 30583 1726853777.31446: checking to see if all hosts have failed and the running result is not ok 30583 1726853777.31447: done checking to see if all hosts have failed 30583 1726853777.31447: getting the remaining hosts for this loop 30583 1726853777.31449: done getting the remaining hosts for this loop 30583 1726853777.31453: getting the next task for host managed_node2 30583 1726853777.31462: done getting next task for host managed_node2 30583 1726853777.31466: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853777.31476: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853777.31501: getting variables 30583 1726853777.31503: in VariableManager get_vars() 30583 1726853777.31547: Calling all_inventory to load vars for managed_node2 30583 1726853777.31550: Calling groups_inventory to load vars for managed_node2 30583 1726853777.31552: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853777.31562: Calling all_plugins_play to load vars for managed_node2 30583 1726853777.31565: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853777.31567: Calling groups_plugins_play to load vars for managed_node2 30583 1726853777.32582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853777.33812: done with get_vars() 30583 1726853777.33833: done getting variables 30583 1726853777.33884: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:36:17 -0400 (0:00:00.040) 0:01:52.676 ****** 30583 1726853777.33911: entering _queue_task() for managed_node2/copy 30583 1726853777.34176: worker is 1 (out of 1 available) 30583 1726853777.34188: exiting _queue_task() for managed_node2/copy 30583 1726853777.34201: done queuing things up, now waiting for results queue to drain 30583 1726853777.34203: waiting for 
pending results... 30583 1726853777.34401: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853777.34512: in run() - task 02083763-bbaf-05ea-abc5-0000000021b2 30583 1726853777.34523: variable 'ansible_search_path' from source: unknown 30583 1726853777.34527: variable 'ansible_search_path' from source: unknown 30583 1726853777.34559: calling self._execute() 30583 1726853777.34636: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853777.34642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853777.34652: variable 'omit' from source: magic vars 30583 1726853777.34940: variable 'ansible_distribution_major_version' from source: facts 30583 1726853777.34949: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853777.35032: variable 'network_provider' from source: set_fact 30583 1726853777.35038: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853777.35041: when evaluation is False, skipping this task 30583 1726853777.35044: _execute() done 30583 1726853777.35046: dumping result to json 30583 1726853777.35051: done dumping result, returning 30583 1726853777.35059: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-05ea-abc5-0000000021b2] 30583 1726853777.35065: sending task result for task 02083763-bbaf-05ea-abc5-0000000021b2 30583 1726853777.35153: done sending task result for task 02083763-bbaf-05ea-abc5-0000000021b2 30583 1726853777.35156: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30583 1726853777.35233: no more pending results, returning what we have 30583 1726853777.35237: results queue empty 30583 
1726853777.35238: checking for any_errors_fatal 30583 1726853777.35244: done checking for any_errors_fatal 30583 1726853777.35245: checking for max_fail_percentage 30583 1726853777.35246: done checking for max_fail_percentage 30583 1726853777.35247: checking to see if all hosts have failed and the running result is not ok 30583 1726853777.35248: done checking to see if all hosts have failed 30583 1726853777.35249: getting the remaining hosts for this loop 30583 1726853777.35250: done getting the remaining hosts for this loop 30583 1726853777.35254: getting the next task for host managed_node2 30583 1726853777.35262: done getting next task for host managed_node2 30583 1726853777.35265: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853777.35270: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853777.35289: getting variables 30583 1726853777.35291: in VariableManager get_vars() 30583 1726853777.35329: Calling all_inventory to load vars for managed_node2 30583 1726853777.35332: Calling groups_inventory to load vars for managed_node2 30583 1726853777.35334: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853777.35341: Calling all_plugins_play to load vars for managed_node2 30583 1726853777.35344: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853777.35346: Calling groups_plugins_play to load vars for managed_node2 30583 1726853777.36117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853777.36981: done with get_vars() 30583 1726853777.36997: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:36:17 -0400 (0:00:00.031) 0:01:52.707 ****** 30583 1726853777.37058: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853777.37294: worker is 1 (out of 1 available) 30583 1726853777.37309: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853777.37322: done queuing things up, now waiting for results queue to drain 30583 1726853777.37323: waiting for pending results... 
30583 1726853777.37509: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853777.37618: in run() - task 02083763-bbaf-05ea-abc5-0000000021b3 30583 1726853777.37628: variable 'ansible_search_path' from source: unknown 30583 1726853777.37632: variable 'ansible_search_path' from source: unknown 30583 1726853777.37664: calling self._execute() 30583 1726853777.37747: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853777.37751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853777.37764: variable 'omit' from source: magic vars 30583 1726853777.38041: variable 'ansible_distribution_major_version' from source: facts 30583 1726853777.38050: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853777.38055: variable 'omit' from source: magic vars 30583 1726853777.38102: variable 'omit' from source: magic vars 30583 1726853777.38214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853777.39906: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853777.39951: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853777.39980: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853777.40005: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853777.40025: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853777.40084: variable 'network_provider' from source: set_fact 30583 1726853777.40180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853777.40200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853777.40217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853777.40243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853777.40254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853777.40309: variable 'omit' from source: magic vars 30583 1726853777.40381: variable 'omit' from source: magic vars 30583 1726853777.40448: variable 'network_connections' from source: include params 30583 1726853777.40461: variable 'interface' from source: play vars 30583 1726853777.40506: variable 'interface' from source: play vars 30583 1726853777.40614: variable 'omit' from source: magic vars 30583 1726853777.40621: variable '__lsr_ansible_managed' from source: task vars 30583 1726853777.40664: variable '__lsr_ansible_managed' from source: task vars 30583 1726853777.40787: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30583 1726853777.40924: Loaded config def from plugin (lookup/template) 30583 1726853777.40929: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30583 1726853777.40962: File lookup term: get_ansible_managed.j2 30583 1726853777.40965: variable 
'ansible_search_path' from source: unknown 30583 1726853777.40968: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30583 1726853777.40979: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30583 1726853777.40992: variable 'ansible_search_path' from source: unknown 30583 1726853777.44220: variable 'ansible_managed' from source: unknown 30583 1726853777.44305: variable 'omit' from source: magic vars 30583 1726853777.44326: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853777.44345: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853777.44361: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853777.44374: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30583 1726853777.44382: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853777.44405: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853777.44408: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853777.44411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853777.44469: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853777.44475: Set connection var ansible_timeout to 10 30583 1726853777.44478: Set connection var ansible_connection to ssh 30583 1726853777.44483: Set connection var ansible_shell_executable to /bin/sh 30583 1726853777.44486: Set connection var ansible_shell_type to sh 30583 1726853777.44493: Set connection var ansible_pipelining to False 30583 1726853777.44514: variable 'ansible_shell_executable' from source: unknown 30583 1726853777.44517: variable 'ansible_connection' from source: unknown 30583 1726853777.44520: variable 'ansible_module_compression' from source: unknown 30583 1726853777.44523: variable 'ansible_shell_type' from source: unknown 30583 1726853777.44525: variable 'ansible_shell_executable' from source: unknown 30583 1726853777.44528: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853777.44530: variable 'ansible_pipelining' from source: unknown 30583 1726853777.44533: variable 'ansible_timeout' from source: unknown 30583 1726853777.44534: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853777.44624: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853777.44636: variable 'omit' from 
source: magic vars 30583 1726853777.44639: starting attempt loop 30583 1726853777.44642: running the handler 30583 1726853777.44648: _low_level_execute_command(): starting 30583 1726853777.44654: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853777.45135: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853777.45150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853777.45154: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853777.45175: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853777.45228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853777.45231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853777.45233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853777.45319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853777.47052: stdout chunk (state=3): >>>/root <<< 30583 1726853777.47152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 
1726853777.47183: stderr chunk (state=3): >>><<< 30583 1726853777.47187: stdout chunk (state=3): >>><<< 30583 1726853777.47208: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853777.47218: _low_level_execute_command(): starting 30583 1726853777.47226: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853777.4720812-35771-139727426807814 `" && echo ansible-tmp-1726853777.4720812-35771-139727426807814="` echo /root/.ansible/tmp/ansible-tmp-1726853777.4720812-35771-139727426807814 `" ) && sleep 0' 30583 1726853777.47635: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30583 1726853777.47670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853777.47676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853777.47679: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853777.47681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853777.47726: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853777.47729: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853777.47733: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853777.47801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853777.49809: stdout chunk (state=3): >>>ansible-tmp-1726853777.4720812-35771-139727426807814=/root/.ansible/tmp/ansible-tmp-1726853777.4720812-35771-139727426807814 <<< 30583 1726853777.49909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853777.49941: stderr chunk (state=3): >>><<< 30583 1726853777.49944: stdout chunk (state=3): >>><<< 30583 1726853777.49962: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853777.4720812-35771-139727426807814=/root/.ansible/tmp/ansible-tmp-1726853777.4720812-35771-139727426807814 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853777.50000: variable 'ansible_module_compression' from source: unknown 30583 1726853777.50041: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30583 1726853777.50084: variable 'ansible_facts' from source: unknown 30583 1726853777.50176: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853777.4720812-35771-139727426807814/AnsiballZ_network_connections.py 30583 1726853777.50277: Sending initial data 30583 1726853777.50288: Sent initial data (168 bytes) 30583 1726853777.50735: stderr chunk (state=3): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853777.50738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853777.50745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853777.50747: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853777.50750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853777.50808: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853777.50811: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853777.50881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853777.52541: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server 
supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30583 1726853777.52545: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853777.52607: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853777.52683: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp_b1kci5x /root/.ansible/tmp/ansible-tmp-1726853777.4720812-35771-139727426807814/AnsiballZ_network_connections.py <<< 30583 1726853777.52686: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853777.4720812-35771-139727426807814/AnsiballZ_network_connections.py" <<< 30583 1726853777.52752: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp_b1kci5x" to remote "/root/.ansible/tmp/ansible-tmp-1726853777.4720812-35771-139727426807814/AnsiballZ_network_connections.py" <<< 30583 1726853777.52755: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853777.4720812-35771-139727426807814/AnsiballZ_network_connections.py" <<< 30583 1726853777.53582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853777.53622: stderr chunk (state=3): >>><<< 30583 1726853777.53625: stdout chunk (state=3): >>><<< 30583 1726853777.53656: done transferring module to remote 30583 1726853777.53667: _low_level_execute_command(): starting 30583 1726853777.53670: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853777.4720812-35771-139727426807814/ /root/.ansible/tmp/ansible-tmp-1726853777.4720812-35771-139727426807814/AnsiballZ_network_connections.py && sleep 0' 
30583 1726853777.54113: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853777.54116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853777.54119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853777.54121: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853777.54123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853777.54124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853777.54183: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853777.54186: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853777.54246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853777.56084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853777.56111: stderr chunk (state=3): >>><<< 30583 1726853777.56114: stdout chunk (state=3): >>><<< 30583 1726853777.56125: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853777.56128: _low_level_execute_command(): starting 30583 1726853777.56132: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853777.4720812-35771-139727426807814/AnsiballZ_network_connections.py && sleep 0' 30583 1726853777.56531: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853777.56540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853777.56567: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853777.56574: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853777.56576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853777.56627: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853777.56633: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853777.56636: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853777.56703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853777.86224: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 3512d7ba-d156-408a-9044-dcd593676efd\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30583 1726853777.88831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853777.88859: stderr chunk (state=3): >>><<< 30583 1726853777.88862: stdout chunk (state=3): >>><<< 30583 1726853777.88884: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 3512d7ba-d156-408a-9044-dcd593676efd\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853777.88915: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853777.4720812-35771-139727426807814/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853777.88925: _low_level_execute_command(): starting 30583 1726853777.88928: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853777.4720812-35771-139727426807814/ > /dev/null 2>&1 && sleep 0' 30583 1726853777.89392: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853777.89395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 
1726853777.89398: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853777.89400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853777.89452: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853777.89455: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853777.89463: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853777.89532: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853777.91431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853777.91460: stderr chunk (state=3): >>><<< 30583 1726853777.91464: stdout chunk (state=3): >>><<< 30583 1726853777.91476: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853777.91482: handler run complete 30583 1726853777.91507: attempt loop complete, returning result 30583 1726853777.91510: _execute() done 30583 1726853777.91512: dumping result to json 30583 1726853777.91519: done dumping result, returning 30583 1726853777.91526: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-05ea-abc5-0000000021b3] 30583 1726853777.91530: sending task result for task 02083763-bbaf-05ea-abc5-0000000021b3 30583 1726853777.91637: done sending task result for task 02083763-bbaf-05ea-abc5-0000000021b3 30583 1726853777.91640: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 3512d7ba-d156-408a-9044-dcd593676efd 30583 1726853777.91762: no more pending results, returning what we have 30583 1726853777.91766: results queue empty 30583 1726853777.91767: checking for any_errors_fatal 30583 1726853777.91774: done checking for any_errors_fatal 30583 1726853777.91775: checking for max_fail_percentage 30583 1726853777.91777: done 
checking for max_fail_percentage 30583 1726853777.91778: checking to see if all hosts have failed and the running result is not ok 30583 1726853777.91779: done checking to see if all hosts have failed 30583 1726853777.91779: getting the remaining hosts for this loop 30583 1726853777.91782: done getting the remaining hosts for this loop 30583 1726853777.91785: getting the next task for host managed_node2 30583 1726853777.91792: done getting next task for host managed_node2 30583 1726853777.91795: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853777.91800: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853777.91812: getting variables 30583 1726853777.91813: in VariableManager get_vars() 30583 1726853777.91853: Calling all_inventory to load vars for managed_node2 30583 1726853777.91856: Calling groups_inventory to load vars for managed_node2 30583 1726853777.91861: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853777.91870: Calling all_plugins_play to load vars for managed_node2 30583 1726853777.91880: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853777.91883: Calling groups_plugins_play to load vars for managed_node2 30583 1726853777.92824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853777.93697: done with get_vars() 30583 1726853777.93715: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:36:17 -0400 (0:00:00.567) 0:01:53.275 ****** 30583 1726853777.93781: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30583 1726853777.94045: worker is 1 (out of 1 available) 30583 1726853777.94063: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30583 1726853777.94079: done queuing things up, now waiting for results queue to drain 30583 1726853777.94081: waiting for pending results... 
30583 1726853777.94273: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853777.94376: in run() - task 02083763-bbaf-05ea-abc5-0000000021b4 30583 1726853777.94389: variable 'ansible_search_path' from source: unknown 30583 1726853777.94392: variable 'ansible_search_path' from source: unknown 30583 1726853777.94424: calling self._execute() 30583 1726853777.94502: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853777.94506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853777.94514: variable 'omit' from source: magic vars 30583 1726853777.94800: variable 'ansible_distribution_major_version' from source: facts 30583 1726853777.94808: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853777.94898: variable 'network_state' from source: role '' defaults 30583 1726853777.94906: Evaluated conditional (network_state != {}): False 30583 1726853777.94909: when evaluation is False, skipping this task 30583 1726853777.94912: _execute() done 30583 1726853777.94915: dumping result to json 30583 1726853777.94917: done dumping result, returning 30583 1726853777.94926: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-05ea-abc5-0000000021b4] 30583 1726853777.94928: sending task result for task 02083763-bbaf-05ea-abc5-0000000021b4 30583 1726853777.95017: done sending task result for task 02083763-bbaf-05ea-abc5-0000000021b4 30583 1726853777.95019: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853777.95077: no more pending results, returning what we have 30583 1726853777.95081: results queue empty 30583 1726853777.95082: checking for any_errors_fatal 30583 1726853777.95095: done checking for any_errors_fatal 
30583 1726853777.95096: checking for max_fail_percentage 30583 1726853777.95098: done checking for max_fail_percentage 30583 1726853777.95099: checking to see if all hosts have failed and the running result is not ok 30583 1726853777.95100: done checking to see if all hosts have failed 30583 1726853777.95100: getting the remaining hosts for this loop 30583 1726853777.95102: done getting the remaining hosts for this loop 30583 1726853777.95105: getting the next task for host managed_node2 30583 1726853777.95113: done getting next task for host managed_node2 30583 1726853777.95117: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30583 1726853777.95121: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853777.95147: getting variables 30583 1726853777.95149: in VariableManager get_vars() 30583 1726853777.95191: Calling all_inventory to load vars for managed_node2 30583 1726853777.95193: Calling groups_inventory to load vars for managed_node2 30583 1726853777.95196: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853777.95204: Calling all_plugins_play to load vars for managed_node2 30583 1726853777.95206: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853777.95208: Calling groups_plugins_play to load vars for managed_node2 30583 1726853777.95994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853777.96969: done with get_vars() 30583 1726853777.96988: done getting variables 30583 1726853777.97030: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:36:17 -0400 (0:00:00.032) 0:01:53.307 ****** 30583 1726853777.97057: entering _queue_task() for managed_node2/debug 30583 1726853777.97307: worker is 1 (out of 1 available) 30583 1726853777.97320: exiting _queue_task() for managed_node2/debug 30583 1726853777.97334: done queuing things up, now waiting for results queue to drain 30583 1726853777.97335: waiting for pending results... 
30583 1726853777.97524: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30583 1726853777.97625: in run() - task 02083763-bbaf-05ea-abc5-0000000021b5 30583 1726853777.97637: variable 'ansible_search_path' from source: unknown 30583 1726853777.97640: variable 'ansible_search_path' from source: unknown 30583 1726853777.97669: calling self._execute() 30583 1726853777.97744: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853777.97749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853777.97756: variable 'omit' from source: magic vars 30583 1726853777.98034: variable 'ansible_distribution_major_version' from source: facts 30583 1726853777.98043: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853777.98050: variable 'omit' from source: magic vars 30583 1726853777.98099: variable 'omit' from source: magic vars 30583 1726853777.98126: variable 'omit' from source: magic vars 30583 1726853777.98162: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853777.98188: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853777.98204: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853777.98219: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853777.98229: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853777.98253: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853777.98256: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853777.98261: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 30583 1726853777.98329: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853777.98332: Set connection var ansible_timeout to 10 30583 1726853777.98335: Set connection var ansible_connection to ssh 30583 1726853777.98344: Set connection var ansible_shell_executable to /bin/sh 30583 1726853777.98347: Set connection var ansible_shell_type to sh 30583 1726853777.98351: Set connection var ansible_pipelining to False 30583 1726853777.98373: variable 'ansible_shell_executable' from source: unknown 30583 1726853777.98376: variable 'ansible_connection' from source: unknown 30583 1726853777.98379: variable 'ansible_module_compression' from source: unknown 30583 1726853777.98381: variable 'ansible_shell_type' from source: unknown 30583 1726853777.98383: variable 'ansible_shell_executable' from source: unknown 30583 1726853777.98386: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853777.98390: variable 'ansible_pipelining' from source: unknown 30583 1726853777.98393: variable 'ansible_timeout' from source: unknown 30583 1726853777.98397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853777.98501: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853777.98510: variable 'omit' from source: magic vars 30583 1726853777.98515: starting attempt loop 30583 1726853777.98517: running the handler 30583 1726853777.98616: variable '__network_connections_result' from source: set_fact 30583 1726853777.98656: handler run complete 30583 1726853777.98675: attempt loop complete, returning result 30583 1726853777.98680: _execute() done 30583 1726853777.98683: dumping result to json 30583 1726853777.98686: 
done dumping result, returning 30583 1726853777.98695: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-05ea-abc5-0000000021b5] 30583 1726853777.98697: sending task result for task 02083763-bbaf-05ea-abc5-0000000021b5 30583 1726853777.98784: done sending task result for task 02083763-bbaf-05ea-abc5-0000000021b5 30583 1726853777.98786: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 3512d7ba-d156-408a-9044-dcd593676efd" ] } 30583 1726853777.98850: no more pending results, returning what we have 30583 1726853777.98854: results queue empty 30583 1726853777.98855: checking for any_errors_fatal 30583 1726853777.98863: done checking for any_errors_fatal 30583 1726853777.98864: checking for max_fail_percentage 30583 1726853777.98866: done checking for max_fail_percentage 30583 1726853777.98867: checking to see if all hosts have failed and the running result is not ok 30583 1726853777.98867: done checking to see if all hosts have failed 30583 1726853777.98868: getting the remaining hosts for this loop 30583 1726853777.98870: done getting the remaining hosts for this loop 30583 1726853777.98875: getting the next task for host managed_node2 30583 1726853777.98883: done getting next task for host managed_node2 30583 1726853777.98887: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30583 1726853777.98891: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853777.98903: getting variables 30583 1726853777.98905: in VariableManager get_vars() 30583 1726853777.98947: Calling all_inventory to load vars for managed_node2 30583 1726853777.98949: Calling groups_inventory to load vars for managed_node2 30583 1726853777.98952: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853777.98960: Calling all_plugins_play to load vars for managed_node2 30583 1726853777.98963: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853777.98965: Calling groups_plugins_play to load vars for managed_node2 30583 1726853777.99754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853778.00614: done with get_vars() 30583 1726853778.00632: done getting variables 30583 1726853778.00677: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:36:18 -0400 (0:00:00.036) 0:01:53.344 ****** 30583 1726853778.00706: entering _queue_task() for managed_node2/debug 30583 1726853778.00951: worker is 1 (out of 1 available) 30583 1726853778.00965: exiting _queue_task() for managed_node2/debug 30583 1726853778.00981: done queuing things up, now waiting for results queue to drain 30583 1726853778.00982: waiting for pending results... 30583 1726853778.01178: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30583 1726853778.01276: in run() - task 02083763-bbaf-05ea-abc5-0000000021b6 30583 1726853778.01287: variable 'ansible_search_path' from source: unknown 30583 1726853778.01291: variable 'ansible_search_path' from source: unknown 30583 1726853778.01323: calling self._execute() 30583 1726853778.01400: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853778.01403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853778.01412: variable 'omit' from source: magic vars 30583 1726853778.01694: variable 'ansible_distribution_major_version' from source: facts 30583 1726853778.01704: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853778.01709: variable 'omit' from source: magic vars 30583 1726853778.01757: variable 'omit' from source: magic vars 30583 1726853778.01781: variable 'omit' from source: magic vars 30583 1726853778.01817: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853778.01843: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853778.01863: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853778.01880: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853778.01889: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853778.01913: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853778.01916: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853778.01919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853778.01995: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853778.02000: Set connection var ansible_timeout to 10 30583 1726853778.02003: Set connection var ansible_connection to ssh 30583 1726853778.02008: Set connection var ansible_shell_executable to /bin/sh 30583 1726853778.02010: Set connection var ansible_shell_type to sh 30583 1726853778.02018: Set connection var ansible_pipelining to False 30583 1726853778.02035: variable 'ansible_shell_executable' from source: unknown 30583 1726853778.02038: variable 'ansible_connection' from source: unknown 30583 1726853778.02041: variable 'ansible_module_compression' from source: unknown 30583 1726853778.02043: variable 'ansible_shell_type' from source: unknown 30583 1726853778.02045: variable 'ansible_shell_executable' from source: unknown 30583 1726853778.02047: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853778.02052: variable 'ansible_pipelining' from source: unknown 30583 1726853778.02054: variable 'ansible_timeout' from source: unknown 30583 1726853778.02058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853778.02160: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853778.02173: variable 'omit' from source: magic vars 30583 1726853778.02178: starting attempt loop 30583 1726853778.02182: running the handler 30583 1726853778.02224: variable '__network_connections_result' from source: set_fact 30583 1726853778.02282: variable '__network_connections_result' from source: set_fact 30583 1726853778.02364: handler run complete 30583 1726853778.02384: attempt loop complete, returning result 30583 1726853778.02387: _execute() done 30583 1726853778.02390: dumping result to json 30583 1726853778.02393: done dumping result, returning 30583 1726853778.02406: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-05ea-abc5-0000000021b6] 30583 1726853778.02409: sending task result for task 02083763-bbaf-05ea-abc5-0000000021b6 30583 1726853778.02495: done sending task result for task 02083763-bbaf-05ea-abc5-0000000021b6 30583 1726853778.02498: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 3512d7ba-d156-408a-9044-dcd593676efd\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 3512d7ba-d156-408a-9044-dcd593676efd" ] } } 30583 1726853778.02595: no more pending results, returning what we have 30583 1726853778.02598: results queue 
empty 30583 1726853778.02599: checking for any_errors_fatal 30583 1726853778.02605: done checking for any_errors_fatal 30583 1726853778.02606: checking for max_fail_percentage 30583 1726853778.02607: done checking for max_fail_percentage 30583 1726853778.02608: checking to see if all hosts have failed and the running result is not ok 30583 1726853778.02609: done checking to see if all hosts have failed 30583 1726853778.02610: getting the remaining hosts for this loop 30583 1726853778.02612: done getting the remaining hosts for this loop 30583 1726853778.02615: getting the next task for host managed_node2 30583 1726853778.02622: done getting next task for host managed_node2 30583 1726853778.02626: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30583 1726853778.02629: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853778.02640: getting variables 30583 1726853778.02641: in VariableManager get_vars() 30583 1726853778.02690: Calling all_inventory to load vars for managed_node2 30583 1726853778.02693: Calling groups_inventory to load vars for managed_node2 30583 1726853778.02695: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853778.02702: Calling all_plugins_play to load vars for managed_node2 30583 1726853778.02705: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853778.02707: Calling groups_plugins_play to load vars for managed_node2 30583 1726853778.03592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853778.08817: done with get_vars() 30583 1726853778.08838: done getting variables 30583 1726853778.08875: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:36:18 -0400 (0:00:00.081) 0:01:53.426 ****** 30583 1726853778.08899: entering _queue_task() for managed_node2/debug 30583 1726853778.09183: worker is 1 (out of 1 available) 30583 1726853778.09197: exiting _queue_task() for managed_node2/debug 30583 1726853778.09209: done queuing things up, now waiting for results queue to drain 30583 1726853778.09211: waiting for pending results... 
30583 1726853778.09405: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30583 1726853778.09517: in run() - task 02083763-bbaf-05ea-abc5-0000000021b7 30583 1726853778.09528: variable 'ansible_search_path' from source: unknown 30583 1726853778.09532: variable 'ansible_search_path' from source: unknown 30583 1726853778.09566: calling self._execute() 30583 1726853778.09639: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853778.09644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853778.09658: variable 'omit' from source: magic vars 30583 1726853778.09951: variable 'ansible_distribution_major_version' from source: facts 30583 1726853778.09960: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853778.10054: variable 'network_state' from source: role '' defaults 30583 1726853778.10065: Evaluated conditional (network_state != {}): False 30583 1726853778.10069: when evaluation is False, skipping this task 30583 1726853778.10075: _execute() done 30583 1726853778.10080: dumping result to json 30583 1726853778.10083: done dumping result, returning 30583 1726853778.10095: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-05ea-abc5-0000000021b7] 30583 1726853778.10098: sending task result for task 02083763-bbaf-05ea-abc5-0000000021b7 30583 1726853778.10197: done sending task result for task 02083763-bbaf-05ea-abc5-0000000021b7 30583 1726853778.10200: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 30583 1726853778.10247: no more pending results, returning what we have 30583 1726853778.10251: results queue empty 30583 1726853778.10252: checking for any_errors_fatal 30583 1726853778.10267: done checking for any_errors_fatal 30583 1726853778.10268: checking for 
max_fail_percentage 30583 1726853778.10275: done checking for max_fail_percentage 30583 1726853778.10277: checking to see if all hosts have failed and the running result is not ok 30583 1726853778.10277: done checking to see if all hosts have failed 30583 1726853778.10278: getting the remaining hosts for this loop 30583 1726853778.10280: done getting the remaining hosts for this loop 30583 1726853778.10284: getting the next task for host managed_node2 30583 1726853778.10291: done getting next task for host managed_node2 30583 1726853778.10295: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30583 1726853778.10300: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853778.10322: getting variables 30583 1726853778.10323: in VariableManager get_vars() 30583 1726853778.10362: Calling all_inventory to load vars for managed_node2 30583 1726853778.10365: Calling groups_inventory to load vars for managed_node2 30583 1726853778.10367: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853778.10385: Calling all_plugins_play to load vars for managed_node2 30583 1726853778.10388: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853778.10391: Calling groups_plugins_play to load vars for managed_node2 30583 1726853778.11148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853778.12020: done with get_vars() 30583 1726853778.12034: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:36:18 -0400 (0:00:00.032) 0:01:53.458 ****** 30583 1726853778.12110: entering _queue_task() for managed_node2/ping 30583 1726853778.12340: worker is 1 (out of 1 available) 30583 1726853778.12353: exiting _queue_task() for managed_node2/ping 30583 1726853778.12366: done queuing things up, now waiting for results queue to drain 30583 1726853778.12368: waiting for pending results... 
30583 1726853778.12554: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 30583 1726853778.12655: in run() - task 02083763-bbaf-05ea-abc5-0000000021b8 30583 1726853778.12669: variable 'ansible_search_path' from source: unknown 30583 1726853778.12674: variable 'ansible_search_path' from source: unknown 30583 1726853778.12710: calling self._execute() 30583 1726853778.12782: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853778.12786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853778.12794: variable 'omit' from source: magic vars 30583 1726853778.13084: variable 'ansible_distribution_major_version' from source: facts 30583 1726853778.13093: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853778.13099: variable 'omit' from source: magic vars 30583 1726853778.13144: variable 'omit' from source: magic vars 30583 1726853778.13165: variable 'omit' from source: magic vars 30583 1726853778.13199: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853778.13225: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853778.13241: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853778.13257: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853778.13269: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853778.13293: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853778.13297: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853778.13299: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30583 1726853778.13368: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853778.13374: Set connection var ansible_timeout to 10 30583 1726853778.13377: Set connection var ansible_connection to ssh 30583 1726853778.13382: Set connection var ansible_shell_executable to /bin/sh 30583 1726853778.13385: Set connection var ansible_shell_type to sh 30583 1726853778.13392: Set connection var ansible_pipelining to False 30583 1726853778.13410: variable 'ansible_shell_executable' from source: unknown 30583 1726853778.13413: variable 'ansible_connection' from source: unknown 30583 1726853778.13415: variable 'ansible_module_compression' from source: unknown 30583 1726853778.13418: variable 'ansible_shell_type' from source: unknown 30583 1726853778.13420: variable 'ansible_shell_executable' from source: unknown 30583 1726853778.13422: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853778.13426: variable 'ansible_pipelining' from source: unknown 30583 1726853778.13428: variable 'ansible_timeout' from source: unknown 30583 1726853778.13432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853778.13580: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853778.13590: variable 'omit' from source: magic vars 30583 1726853778.13595: starting attempt loop 30583 1726853778.13598: running the handler 30583 1726853778.13609: _low_level_execute_command(): starting 30583 1726853778.13616: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853778.14134: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 
1726853778.14138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853778.14141: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853778.14143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853778.14176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853778.14191: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853778.14279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853778.15990: stdout chunk (state=3): >>>/root <<< 30583 1726853778.16091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853778.16117: stderr chunk (state=3): >>><<< 30583 1726853778.16121: stdout chunk (state=3): >>><<< 30583 1726853778.16139: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853778.16149: _low_level_execute_command(): starting 30583 1726853778.16154: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853778.161383-35786-194110087209438 `" && echo ansible-tmp-1726853778.161383-35786-194110087209438="` echo /root/.ansible/tmp/ansible-tmp-1726853778.161383-35786-194110087209438 `" ) && sleep 0' 30583 1726853778.16564: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853778.16567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853778.16598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853778.16601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address 
debug1: re-parsing configuration <<< 30583 1726853778.16612: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853778.16614: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853778.16661: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853778.16664: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853778.16740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853778.18692: stdout chunk (state=3): >>>ansible-tmp-1726853778.161383-35786-194110087209438=/root/.ansible/tmp/ansible-tmp-1726853778.161383-35786-194110087209438 <<< 30583 1726853778.18801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853778.18826: stderr chunk (state=3): >>><<< 30583 1726853778.18829: stdout chunk (state=3): >>><<< 30583 1726853778.18844: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853778.161383-35786-194110087209438=/root/.ansible/tmp/ansible-tmp-1726853778.161383-35786-194110087209438 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853778.18883: variable 'ansible_module_compression' from source: unknown 30583 1726853778.18916: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30583 1726853778.18944: variable 'ansible_facts' from source: unknown 30583 1726853778.18998: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853778.161383-35786-194110087209438/AnsiballZ_ping.py 30583 1726853778.19096: Sending initial data 30583 1726853778.19099: Sent initial data (152 bytes) 30583 1726853778.19532: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853778.19537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853778.19540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853778.19542: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853778.19544: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853778.19595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853778.19602: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853778.19604: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853778.19675: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853778.21284: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 30583 1726853778.21287: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853778.21344: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853778.21419: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpidf20g_i /root/.ansible/tmp/ansible-tmp-1726853778.161383-35786-194110087209438/AnsiballZ_ping.py <<< 30583 1726853778.21422: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853778.161383-35786-194110087209438/AnsiballZ_ping.py" <<< 30583 1726853778.21489: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 30583 1726853778.21493: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpidf20g_i" to remote "/root/.ansible/tmp/ansible-tmp-1726853778.161383-35786-194110087209438/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853778.161383-35786-194110087209438/AnsiballZ_ping.py" <<< 30583 1726853778.22136: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853778.22151: stderr chunk (state=3): >>><<< 30583 1726853778.22154: stdout chunk (state=3): >>><<< 30583 1726853778.22191: done transferring module to remote 30583 1726853778.22200: _low_level_execute_command(): starting 30583 1726853778.22205: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853778.161383-35786-194110087209438/ /root/.ansible/tmp/ansible-tmp-1726853778.161383-35786-194110087209438/AnsiballZ_ping.py && sleep 0' 30583 1726853778.22650: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853778.22653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853778.22655: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853778.22657: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853778.22659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853778.22715: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853778.22721: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853778.22789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853778.24664: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853778.24692: stderr chunk (state=3): >>><<< 30583 1726853778.24695: stdout chunk (state=3): >>><<< 30583 1726853778.24710: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853778.24712: _low_level_execute_command(): starting 30583 1726853778.24717: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853778.161383-35786-194110087209438/AnsiballZ_ping.py && sleep 0' 30583 1726853778.25144: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853778.25147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853778.25185: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853778.25188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853778.25190: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853778.25192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853778.25194: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853778.25247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853778.25250: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853778.25255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853778.25335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853778.40869: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30583 1726853778.42260: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853778.42289: stderr chunk (state=3): >>><<< 30583 1726853778.42292: stdout chunk (state=3): >>><<< 30583 1726853778.42309: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853778.42332: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853778.161383-35786-194110087209438/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853778.42341: _low_level_execute_command(): starting 30583 1726853778.42346: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853778.161383-35786-194110087209438/ > /dev/null 2>&1 && sleep 0' 30583 1726853778.42805: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853778.42809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853778.42811: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853778.42813: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853778.42815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853778.42873: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853778.42880: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853778.42882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853778.42949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853778.44827: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853778.44854: stderr chunk (state=3): >>><<< 30583 1726853778.44859: stdout chunk (state=3): >>><<< 30583 1726853778.44875: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853778.44879: handler run complete 30583 1726853778.44893: attempt loop complete, returning result 30583 1726853778.44896: _execute() done 30583 1726853778.44899: dumping result to json 30583 1726853778.44901: done dumping result, returning 30583 1726853778.44909: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-05ea-abc5-0000000021b8] 30583 1726853778.44912: sending task result for task 02083763-bbaf-05ea-abc5-0000000021b8 30583 1726853778.45004: done sending task result for task 02083763-bbaf-05ea-abc5-0000000021b8 30583 1726853778.45006: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 30583 1726853778.45074: no more pending results, returning what we have 30583 1726853778.45078: results queue empty 30583 1726853778.45079: checking for any_errors_fatal 30583 1726853778.45086: done checking for any_errors_fatal 30583 1726853778.45087: checking for max_fail_percentage 30583 1726853778.45088: done checking for max_fail_percentage 30583 1726853778.45089: checking to see if all hosts have failed and the running result is not ok 30583 1726853778.45090: done checking to see if all hosts have failed 30583 1726853778.45091: getting the remaining hosts for this loop 30583 1726853778.45092: done getting the remaining hosts for this loop 30583 1726853778.45095: getting the next task for host managed_node2 30583 1726853778.45109: done getting next task for host managed_node2 30583 1726853778.45111: ^ task is: TASK: meta (role_complete) 30583 1726853778.45117: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853778.45129: getting variables 30583 1726853778.45131: in VariableManager get_vars() 30583 1726853778.45187: Calling all_inventory to load vars for managed_node2 30583 1726853778.45189: Calling groups_inventory to load vars for managed_node2 30583 1726853778.45192: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853778.45201: Calling all_plugins_play to load vars for managed_node2 30583 1726853778.45204: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853778.45206: Calling groups_plugins_play to load vars for managed_node2 30583 1726853778.46167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853778.47033: done with get_vars() 30583 1726853778.47052: done getting variables 30583 1726853778.47115: done queuing things up, now waiting for results queue to drain 30583 1726853778.47117: results queue empty 30583 1726853778.47117: checking for any_errors_fatal 30583 1726853778.47120: done checking for any_errors_fatal 30583 1726853778.47120: checking for max_fail_percentage 30583 1726853778.47121: done checking for max_fail_percentage 30583 1726853778.47122: checking to see if all hosts have failed and the running result is not ok 30583 1726853778.47123: done checking to see if all hosts have failed 30583 1726853778.47123: getting the remaining hosts for this loop 30583 1726853778.47124: done getting the remaining hosts for this loop 30583 1726853778.47126: getting the next task for host managed_node2 30583 1726853778.47130: done getting next task for host managed_node2 30583 1726853778.47131: ^ task is: TASK: Show result 30583 1726853778.47133: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853778.47135: getting variables 30583 1726853778.47135: in VariableManager get_vars() 30583 1726853778.47144: Calling all_inventory to load vars for managed_node2 30583 1726853778.47145: Calling groups_inventory to load vars for managed_node2 30583 1726853778.47147: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853778.47150: Calling all_plugins_play to load vars for managed_node2 30583 1726853778.47151: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853778.47153: Calling groups_plugins_play to load vars for managed_node2 30583 1726853778.47795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853778.48668: done with get_vars() 30583 1726853778.48683: done getting variables 30583 1726853778.48714: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Friday 20 September 2024 13:36:18 -0400 (0:00:00.366) 0:01:53.824 ****** 30583 1726853778.48741: entering _queue_task() for managed_node2/debug 30583 1726853778.49016: worker is 1 (out of 1 available) 30583 1726853778.49031: exiting _queue_task() for managed_node2/debug 30583 1726853778.49046: done queuing things up, now waiting for results queue to drain 30583 1726853778.49047: waiting for pending results... 30583 1726853778.49246: running TaskExecutor() for managed_node2/TASK: Show result 30583 1726853778.49332: in run() - task 02083763-bbaf-05ea-abc5-00000000213a 30583 1726853778.49344: variable 'ansible_search_path' from source: unknown 30583 1726853778.49348: variable 'ansible_search_path' from source: unknown 30583 1726853778.49382: calling self._execute() 30583 1726853778.49462: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853778.49466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853778.49475: variable 'omit' from source: magic vars 30583 1726853778.49770: variable 'ansible_distribution_major_version' from source: facts 30583 1726853778.49781: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853778.49786: variable 'omit' from source: magic vars 30583 1726853778.49820: variable 'omit' from source: magic vars 30583 1726853778.49847: variable 'omit' from source: magic vars 30583 1726853778.49884: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853778.49911: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853778.49926: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853778.49942: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853778.49951: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853778.49980: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853778.49984: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853778.49987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853778.50060: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853778.50064: Set connection var ansible_timeout to 10 30583 1726853778.50067: Set connection var ansible_connection to ssh 30583 1726853778.50069: Set connection var ansible_shell_executable to /bin/sh 30583 1726853778.50074: Set connection var ansible_shell_type to sh 30583 1726853778.50082: Set connection var ansible_pipelining to False 30583 1726853778.50103: variable 'ansible_shell_executable' from source: unknown 30583 1726853778.50106: variable 'ansible_connection' from source: unknown 30583 1726853778.50109: variable 'ansible_module_compression' from source: unknown 30583 1726853778.50111: variable 'ansible_shell_type' from source: unknown 30583 1726853778.50114: variable 'ansible_shell_executable' from source: unknown 30583 1726853778.50116: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853778.50118: variable 'ansible_pipelining' from source: unknown 30583 1726853778.50120: variable 'ansible_timeout' from source: unknown 30583 1726853778.50125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853778.50228: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853778.50238: variable 'omit' from source: magic vars 30583 1726853778.50243: starting attempt loop 30583 1726853778.50246: running the handler 30583 1726853778.50290: variable '__network_connections_result' from source: set_fact 30583 1726853778.50347: variable '__network_connections_result' from source: set_fact 30583 1726853778.50433: handler run complete 30583 1726853778.50449: attempt loop complete, returning result 30583 1726853778.50453: _execute() done 30583 1726853778.50456: dumping result to json 30583 1726853778.50462: done dumping result, returning 30583 1726853778.50467: done running TaskExecutor() for managed_node2/TASK: Show result [02083763-bbaf-05ea-abc5-00000000213a] 30583 1726853778.50473: sending task result for task 02083763-bbaf-05ea-abc5-00000000213a 30583 1726853778.50565: done sending task result for task 02083763-bbaf-05ea-abc5-00000000213a 30583 1726853778.50568: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 3512d7ba-d156-408a-9044-dcd593676efd\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 3512d7ba-d156-408a-9044-dcd593676efd" ] } } 30583 1726853778.50651: no more pending results, returning what we have 30583 1726853778.50655: results queue empty 30583 1726853778.50656: checking for any_errors_fatal 30583 
1726853778.50660: done checking for any_errors_fatal 30583 1726853778.50661: checking for max_fail_percentage 30583 1726853778.50663: done checking for max_fail_percentage 30583 1726853778.50663: checking to see if all hosts have failed and the running result is not ok 30583 1726853778.50664: done checking to see if all hosts have failed 30583 1726853778.50665: getting the remaining hosts for this loop 30583 1726853778.50667: done getting the remaining hosts for this loop 30583 1726853778.50672: getting the next task for host managed_node2 30583 1726853778.50683: done getting next task for host managed_node2 30583 1726853778.50686: ^ task is: TASK: Include network role 30583 1726853778.50690: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853778.50693: getting variables 30583 1726853778.50695: in VariableManager get_vars() 30583 1726853778.50731: Calling all_inventory to load vars for managed_node2 30583 1726853778.50734: Calling groups_inventory to load vars for managed_node2 30583 1726853778.50737: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853778.50746: Calling all_plugins_play to load vars for managed_node2 30583 1726853778.50749: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853778.50751: Calling groups_plugins_play to load vars for managed_node2 30583 1726853778.51674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853778.52533: done with get_vars() 30583 1726853778.52549: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3 Friday 20 September 2024 13:36:18 -0400 (0:00:00.038) 0:01:53.863 ****** 30583 1726853778.52621: entering _queue_task() for managed_node2/include_role 30583 1726853778.52866: worker is 1 (out of 1 available) 30583 1726853778.52881: exiting _queue_task() for managed_node2/include_role 30583 1726853778.52895: done queuing things up, now waiting for results queue to drain 30583 1726853778.52896: waiting for pending results... 
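The `module_args` echoed in the Show result output above imply a single bridge connection spec handed to the `nm` provider. A reconstruction for illustration (the real spec lives in the test playbook's YAML, not in Python):

```python
# Reconstruction of the connection spec visible in the module_args of the
# "Show result" task output above. Illustrative only; variable names here
# are not Ansible internals.
statebr_profile = {
    "name": "statebr",
    "type": "bridge",
    "persistent_state": "present",  # profile is created but not activated
    "ip": {
        "dhcp4": False,  # no DHCPv4 on the bridge
        "auto6": False,  # no IPv6 autoconfiguration
    },
}

# The role passed exactly one such connection to the "nm" provider,
# matching the _invocation block printed in the task result:
module_args = {
    "connections": [statebr_profile],
    "provider": "nm",
    "force_state_change": False,
    "ignore_errors": False,
}
```

The `stderr` in the same result confirms the outcome: the provider added connection `statebr` with a new UUID and reported `changed: true`.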
30583 1726853778.53107: running TaskExecutor() for managed_node2/TASK: Include network role 30583 1726853778.53203: in run() - task 02083763-bbaf-05ea-abc5-00000000213e 30583 1726853778.53220: variable 'ansible_search_path' from source: unknown 30583 1726853778.53223: variable 'ansible_search_path' from source: unknown 30583 1726853778.53252: calling self._execute() 30583 1726853778.53336: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853778.53340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853778.53350: variable 'omit' from source: magic vars 30583 1726853778.53644: variable 'ansible_distribution_major_version' from source: facts 30583 1726853778.53654: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853778.53660: _execute() done 30583 1726853778.53667: dumping result to json 30583 1726853778.53675: done dumping result, returning 30583 1726853778.53682: done running TaskExecutor() for managed_node2/TASK: Include network role [02083763-bbaf-05ea-abc5-00000000213e] 30583 1726853778.53684: sending task result for task 02083763-bbaf-05ea-abc5-00000000213e 30583 1726853778.53790: done sending task result for task 02083763-bbaf-05ea-abc5-00000000213e 30583 1726853778.53793: WORKER PROCESS EXITING 30583 1726853778.53824: no more pending results, returning what we have 30583 1726853778.53829: in VariableManager get_vars() 30583 1726853778.53877: Calling all_inventory to load vars for managed_node2 30583 1726853778.53880: Calling groups_inventory to load vars for managed_node2 30583 1726853778.53884: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853778.53895: Calling all_plugins_play to load vars for managed_node2 30583 1726853778.53899: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853778.53908: Calling groups_plugins_play to load vars for managed_node2 30583 1726853778.54702: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853778.55682: done with get_vars() 30583 1726853778.55695: variable 'ansible_search_path' from source: unknown 30583 1726853778.55696: variable 'ansible_search_path' from source: unknown 30583 1726853778.55790: variable 'omit' from source: magic vars 30583 1726853778.55818: variable 'omit' from source: magic vars 30583 1726853778.55827: variable 'omit' from source: magic vars 30583 1726853778.55829: we have included files to process 30583 1726853778.55830: generating all_blocks data 30583 1726853778.55831: done generating all_blocks data 30583 1726853778.55834: processing included file: fedora.linux_system_roles.network 30583 1726853778.55847: in VariableManager get_vars() 30583 1726853778.55858: done with get_vars() 30583 1726853778.55881: in VariableManager get_vars() 30583 1726853778.55893: done with get_vars() 30583 1726853778.55919: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30583 1726853778.55992: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30583 1726853778.56037: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30583 1726853778.56302: in VariableManager get_vars() 30583 1726853778.56316: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30583 1726853778.57524: iterating over new_blocks loaded from include file 30583 1726853778.57526: in VariableManager get_vars() 30583 1726853778.57537: done with get_vars() 30583 1726853778.57538: filtering new block on tags 30583 1726853778.57694: done filtering new block on tags 30583 1726853778.57697: in VariableManager get_vars() 30583 1726853778.57707: done with get_vars() 30583 1726853778.57708: filtering new block on tags 30583 1726853778.57720: done 
filtering new block on tags 30583 1726853778.57722: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30583 1726853778.57726: extending task lists for all hosts with included blocks 30583 1726853778.57790: done extending task lists 30583 1726853778.57791: done processing included files 30583 1726853778.57791: results queue empty 30583 1726853778.57792: checking for any_errors_fatal 30583 1726853778.57795: done checking for any_errors_fatal 30583 1726853778.57795: checking for max_fail_percentage 30583 1726853778.57796: done checking for max_fail_percentage 30583 1726853778.57797: checking to see if all hosts have failed and the running result is not ok 30583 1726853778.57797: done checking to see if all hosts have failed 30583 1726853778.57798: getting the remaining hosts for this loop 30583 1726853778.57799: done getting the remaining hosts for this loop 30583 1726853778.57800: getting the next task for host managed_node2 30583 1726853778.57803: done getting next task for host managed_node2 30583 1726853778.57805: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30583 1726853778.57807: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853778.57814: getting variables 30583 1726853778.57814: in VariableManager get_vars() 30583 1726853778.57825: Calling all_inventory to load vars for managed_node2 30583 1726853778.57827: Calling groups_inventory to load vars for managed_node2 30583 1726853778.57828: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853778.57832: Calling all_plugins_play to load vars for managed_node2 30583 1726853778.57833: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853778.57836: Calling groups_plugins_play to load vars for managed_node2 30583 1726853778.58475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853778.59329: done with get_vars() 30583 1726853778.59346: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:36:18 -0400 (0:00:00.067) 0:01:53.931 ****** 30583 1726853778.59398: entering _queue_task() for managed_node2/include_tasks 30583 1726853778.59667: worker is 1 (out of 1 available) 30583 1726853778.59684: exiting _queue_task() for managed_node2/include_tasks 30583 1726853778.59697: done queuing things up, now waiting for results queue to drain 30583 1726853778.59699: waiting for pending results... 
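Every task in this log passes through the same gate before running: `Evaluated conditional (ansible_distribution_major_version != '6'): True`. Ansible renders that `when:` expression through Jinja2 against the host's facts; as a rough plain-Python sketch (not Ansible's actual implementation, and the fact value below is a hypothetical example), the check amounts to:

```python
# Approximation of the `when: ansible_distribution_major_version != '6'`
# gate seen throughout this log. Ansible stores this fact as a string,
# so the comparison is string inequality, not numeric.
def evaluate_version_gate(facts: dict) -> bool:
    """Return True when the task should run (major version is not '6')."""
    return facts.get("ansible_distribution_major_version") != "6"

facts = {"ansible_distribution_major_version": "9"}  # hypothetical host facts
print(evaluate_version_gate(facts))  # a 9.x host passes the gate -> True
```

When the gate is False, the TaskExecutor logs "when evaluation is False, skipping this task" instead of dispatching the module, which is exactly the pattern visible later in this log.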
30583 1726853778.59900: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30583 1726853778.60002: in run() - task 02083763-bbaf-05ea-abc5-000000002328 30583 1726853778.60012: variable 'ansible_search_path' from source: unknown 30583 1726853778.60015: variable 'ansible_search_path' from source: unknown 30583 1726853778.60046: calling self._execute() 30583 1726853778.60125: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853778.60129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853778.60139: variable 'omit' from source: magic vars 30583 1726853778.60426: variable 'ansible_distribution_major_version' from source: facts 30583 1726853778.60434: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853778.60440: _execute() done 30583 1726853778.60443: dumping result to json 30583 1726853778.60446: done dumping result, returning 30583 1726853778.60486: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-05ea-abc5-000000002328] 30583 1726853778.60490: sending task result for task 02083763-bbaf-05ea-abc5-000000002328 30583 1726853778.60556: done sending task result for task 02083763-bbaf-05ea-abc5-000000002328 30583 1726853778.60559: WORKER PROCESS EXITING 30583 1726853778.60613: no more pending results, returning what we have 30583 1726853778.60619: in VariableManager get_vars() 30583 1726853778.60675: Calling all_inventory to load vars for managed_node2 30583 1726853778.60678: Calling groups_inventory to load vars for managed_node2 30583 1726853778.60681: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853778.60691: Calling all_plugins_play to load vars for managed_node2 30583 1726853778.60694: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853778.60697: Calling 
groups_plugins_play to load vars for managed_node2 30583 1726853778.61596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853778.62598: done with get_vars() 30583 1726853778.62619: variable 'ansible_search_path' from source: unknown 30583 1726853778.62620: variable 'ansible_search_path' from source: unknown 30583 1726853778.62663: we have included files to process 30583 1726853778.62665: generating all_blocks data 30583 1726853778.62667: done generating all_blocks data 30583 1726853778.62669: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853778.62670: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853778.62675: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853778.63292: done processing included file 30583 1726853778.63293: iterating over new_blocks loaded from include file 30583 1726853778.63294: in VariableManager get_vars() 30583 1726853778.63311: done with get_vars() 30583 1726853778.63312: filtering new block on tags 30583 1726853778.63331: done filtering new block on tags 30583 1726853778.63333: in VariableManager get_vars() 30583 1726853778.63350: done with get_vars() 30583 1726853778.63351: filtering new block on tags 30583 1726853778.63386: done filtering new block on tags 30583 1726853778.63389: in VariableManager get_vars() 30583 1726853778.63415: done with get_vars() 30583 1726853778.63417: filtering new block on tags 30583 1726853778.63462: done filtering new block on tags 30583 1726853778.63464: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30583 1726853778.63470: extending task lists for 
all hosts with included blocks 30583 1726853778.65326: done extending task lists 30583 1726853778.65328: done processing included files 30583 1726853778.65329: results queue empty 30583 1726853778.65330: checking for any_errors_fatal 30583 1726853778.65332: done checking for any_errors_fatal 30583 1726853778.65333: checking for max_fail_percentage 30583 1726853778.65335: done checking for max_fail_percentage 30583 1726853778.65335: checking to see if all hosts have failed and the running result is not ok 30583 1726853778.65336: done checking to see if all hosts have failed 30583 1726853778.65337: getting the remaining hosts for this loop 30583 1726853778.65338: done getting the remaining hosts for this loop 30583 1726853778.65341: getting the next task for host managed_node2 30583 1726853778.65346: done getting next task for host managed_node2 30583 1726853778.65349: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30583 1726853778.65352: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853778.65367: getting variables 30583 1726853778.65368: in VariableManager get_vars() 30583 1726853778.65385: Calling all_inventory to load vars for managed_node2 30583 1726853778.65387: Calling groups_inventory to load vars for managed_node2 30583 1726853778.65389: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853778.65393: Calling all_plugins_play to load vars for managed_node2 30583 1726853778.65395: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853778.65398: Calling groups_plugins_play to load vars for managed_node2 30583 1726853778.66538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853778.68096: done with get_vars() 30583 1726853778.68120: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:36:18 -0400 (0:00:00.088) 0:01:54.019 ****** 30583 1726853778.68205: entering _queue_task() for managed_node2/setup 30583 1726853778.68542: worker is 1 (out of 1 available) 30583 1726853778.68564: exiting _queue_task() for managed_node2/setup 30583 1726853778.68579: done queuing things up, now waiting for results queue to drain 30583 1726853778.68580: waiting for pending results... 
30583 1726853778.68773: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30583 1726853778.68872: in run() - task 02083763-bbaf-05ea-abc5-00000000237f 30583 1726853778.68884: variable 'ansible_search_path' from source: unknown 30583 1726853778.68888: variable 'ansible_search_path' from source: unknown 30583 1726853778.68917: calling self._execute() 30583 1726853778.68993: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853778.68997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853778.69006: variable 'omit' from source: magic vars 30583 1726853778.69293: variable 'ansible_distribution_major_version' from source: facts 30583 1726853778.69302: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853778.69450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853778.71285: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853778.71328: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853778.71362: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853778.71388: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853778.71407: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853778.71473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853778.71494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853778.71510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853778.71535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853778.71547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853778.71588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853778.71603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853778.71619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853778.71643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853778.71654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853778.71764: variable '__network_required_facts' from source: role 
'' defaults 30583 1726853778.71777: variable 'ansible_facts' from source: unknown 30583 1726853778.72229: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30583 1726853778.72233: when evaluation is False, skipping this task 30583 1726853778.72236: _execute() done 30583 1726853778.72238: dumping result to json 30583 1726853778.72240: done dumping result, returning 30583 1726853778.72248: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-05ea-abc5-00000000237f] 30583 1726853778.72251: sending task result for task 02083763-bbaf-05ea-abc5-00000000237f 30583 1726853778.72338: done sending task result for task 02083763-bbaf-05ea-abc5-00000000237f 30583 1726853778.72340: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853778.72387: no more pending results, returning what we have 30583 1726853778.72390: results queue empty 30583 1726853778.72391: checking for any_errors_fatal 30583 1726853778.72393: done checking for any_errors_fatal 30583 1726853778.72393: checking for max_fail_percentage 30583 1726853778.72395: done checking for max_fail_percentage 30583 1726853778.72396: checking to see if all hosts have failed and the running result is not ok 30583 1726853778.72397: done checking to see if all hosts have failed 30583 1726853778.72398: getting the remaining hosts for this loop 30583 1726853778.72400: done getting the remaining hosts for this loop 30583 1726853778.72403: getting the next task for host managed_node2 30583 1726853778.72415: done getting next task for host managed_node2 30583 1726853778.72419: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30583 1726853778.72425: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853778.72450: getting variables 30583 1726853778.72452: in VariableManager get_vars() 30583 1726853778.72500: Calling all_inventory to load vars for managed_node2 30583 1726853778.72504: Calling groups_inventory to load vars for managed_node2 30583 1726853778.72506: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853778.72515: Calling all_plugins_play to load vars for managed_node2 30583 1726853778.72518: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853778.72526: Calling groups_plugins_play to load vars for managed_node2 30583 1726853778.73519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853778.74833: done with get_vars() 30583 1726853778.74850: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:36:18 -0400 (0:00:00.067) 0:01:54.086 ****** 30583 1726853778.74925: entering _queue_task() for managed_node2/stat 30583 1726853778.75182: worker is 1 (out of 1 available) 30583 1726853778.75200: exiting _queue_task() for managed_node2/stat 30583 1726853778.75213: done queuing things up, now waiting for results queue to drain 30583 1726853778.75214: waiting for pending results... 
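The "Ensure ansible_facts used by role are present" task above was skipped because `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` evaluated to False: every fact the role needs was already gathered, so the extra `setup` call is unnecessary. A plain-Python sketch of that Jinja expression (the fact names are hypothetical examples, not the role's actual `__network_required_facts` list):

```python
# Sketch of the required-facts check:
#   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
# The `difference` filter behaves like a set difference here: the expression
# is True only when at least one required fact is still missing.
def setup_needed(required_facts, ansible_facts) -> bool:
    """True when a follow-up fact-gathering (setup) run is required."""
    missing = set(required_facts) - set(ansible_facts.keys())
    return len(missing) > 0

required = ["distribution", "os_family"]                    # hypothetical
gathered = {"distribution": "Fedora", "os_family": "RedHat"}
print(setup_needed(required, gathered))  # False -> setup task is skipped
```

This is why the log shows `skipping: [managed_node2]` with no module invocation: conditionals that resolve to False short-circuit before any connection is made to the host.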
30583 1726853778.75401: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30583 1726853778.75498: in run() - task 02083763-bbaf-05ea-abc5-000000002381 30583 1726853778.75509: variable 'ansible_search_path' from source: unknown 30583 1726853778.75512: variable 'ansible_search_path' from source: unknown 30583 1726853778.75541: calling self._execute() 30583 1726853778.75619: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853778.75624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853778.75632: variable 'omit' from source: magic vars 30583 1726853778.75914: variable 'ansible_distribution_major_version' from source: facts 30583 1726853778.75923: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853778.76037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853778.76480: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853778.76484: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853778.76487: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853778.76489: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853778.76491: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853778.76509: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853778.76533: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853778.76557: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853778.76720: variable '__network_is_ostree' from source: set_fact 30583 1726853778.76723: Evaluated conditional (not __network_is_ostree is defined): False 30583 1726853778.76725: when evaluation is False, skipping this task 30583 1726853778.76727: _execute() done 30583 1726853778.76729: dumping result to json 30583 1726853778.76730: done dumping result, returning 30583 1726853778.76732: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-05ea-abc5-000000002381] 30583 1726853778.76734: sending task result for task 02083763-bbaf-05ea-abc5-000000002381 30583 1726853778.76797: done sending task result for task 02083763-bbaf-05ea-abc5-000000002381 30583 1726853778.76799: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30583 1726853778.76875: no more pending results, returning what we have 30583 1726853778.76879: results queue empty 30583 1726853778.76880: checking for any_errors_fatal 30583 1726853778.76885: done checking for any_errors_fatal 30583 1726853778.76886: checking for max_fail_percentage 30583 1726853778.76887: done checking for max_fail_percentage 30583 1726853778.76888: checking to see if all hosts have failed and the running result is not ok 30583 1726853778.76889: done checking to see if all hosts have failed 30583 1726853778.76889: getting the remaining hosts for this loop 30583 1726853778.76891: done getting the remaining hosts for this loop 30583 
1726853778.76894: getting the next task for host managed_node2 30583 1726853778.76902: done getting next task for host managed_node2 30583 1726853778.76905: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30583 1726853778.76911: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853778.76936: getting variables 30583 1726853778.76938: in VariableManager get_vars() 30583 1726853778.76976: Calling all_inventory to load vars for managed_node2 30583 1726853778.76979: Calling groups_inventory to load vars for managed_node2 30583 1726853778.76981: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853778.76989: Calling all_plugins_play to load vars for managed_node2 30583 1726853778.76992: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853778.76994: Calling groups_plugins_play to load vars for managed_node2 30583 1726853778.78382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853778.79246: done with get_vars() 30583 1726853778.79262: done getting variables 30583 1726853778.79306: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:36:18 -0400 (0:00:00.044) 0:01:54.130 ****** 30583 1726853778.79335: entering _queue_task() for managed_node2/set_fact 30583 1726853778.79593: worker is 1 (out of 1 available) 30583 1726853778.79607: exiting _queue_task() for managed_node2/set_fact 30583 1726853778.79622: done queuing things up, now waiting for results queue to drain 30583 1726853778.79623: waiting for pending results... 
30583 1726853778.79828: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30583 1726853778.79948: in run() - task 02083763-bbaf-05ea-abc5-000000002382 30583 1726853778.79964: variable 'ansible_search_path' from source: unknown 30583 1726853778.79967: variable 'ansible_search_path' from source: unknown 30583 1726853778.79996: calling self._execute() 30583 1726853778.80139: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853778.80143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853778.80146: variable 'omit' from source: magic vars 30583 1726853778.80576: variable 'ansible_distribution_major_version' from source: facts 30583 1726853778.80580: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853778.80709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853778.81011: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853778.81051: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853778.81081: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853778.81107: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853778.81176: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853778.81195: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853778.81212: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853778.81239: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853778.81311: variable '__network_is_ostree' from source: set_fact 30583 1726853778.81317: Evaluated conditional (not __network_is_ostree is defined): False 30583 1726853778.81320: when evaluation is False, skipping this task 30583 1726853778.81322: _execute() done 30583 1726853778.81325: dumping result to json 30583 1726853778.81329: done dumping result, returning 30583 1726853778.81337: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-05ea-abc5-000000002382] 30583 1726853778.81339: sending task result for task 02083763-bbaf-05ea-abc5-000000002382 30583 1726853778.81425: done sending task result for task 02083763-bbaf-05ea-abc5-000000002382 30583 1726853778.81427: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30583 1726853778.81511: no more pending results, returning what we have 30583 1726853778.81514: results queue empty 30583 1726853778.81515: checking for any_errors_fatal 30583 1726853778.81522: done checking for any_errors_fatal 30583 1726853778.81523: checking for max_fail_percentage 30583 1726853778.81525: done checking for max_fail_percentage 30583 1726853778.81525: checking to see if all hosts have failed and the running result is not ok 30583 1726853778.81526: done checking to see if all hosts have failed 30583 1726853778.81527: getting the remaining hosts for this loop 30583 1726853778.81529: done getting the remaining hosts for this loop 
30583 1726853778.81532: getting the next task for host managed_node2 30583 1726853778.81543: done getting next task for host managed_node2 30583 1726853778.81546: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853778.81552: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853778.81573: getting variables 30583 1726853778.81576: in VariableManager get_vars() 30583 1726853778.81612: Calling all_inventory to load vars for managed_node2 30583 1726853778.81614: Calling groups_inventory to load vars for managed_node2 30583 1726853778.81616: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853778.81623: Calling all_plugins_play to load vars for managed_node2 30583 1726853778.81626: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853778.81628: Calling groups_plugins_play to load vars for managed_node2 30583 1726853778.82411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853778.83284: done with get_vars() 30583 1726853778.83301: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:36:18 -0400 (0:00:00.040) 0:01:54.170 ****** 30583 1726853778.83368: entering _queue_task() for managed_node2/service_facts 30583 1726853778.83611: worker is 1 (out of 1 available) 30583 1726853778.83626: exiting _queue_task() for managed_node2/service_facts 30583 1726853778.83641: done queuing things up, now waiting for results queue to drain 30583 1726853778.83642: waiting for pending results... 
30583 1726853778.83837: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853778.83935: in run() - task 02083763-bbaf-05ea-abc5-000000002384 30583 1726853778.83946: variable 'ansible_search_path' from source: unknown 30583 1726853778.83949: variable 'ansible_search_path' from source: unknown 30583 1726853778.83983: calling self._execute() 30583 1726853778.84058: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853778.84065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853778.84074: variable 'omit' from source: magic vars 30583 1726853778.84359: variable 'ansible_distribution_major_version' from source: facts 30583 1726853778.84370: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853778.84377: variable 'omit' from source: magic vars 30583 1726853778.84435: variable 'omit' from source: magic vars 30583 1726853778.84459: variable 'omit' from source: magic vars 30583 1726853778.84494: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853778.84523: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853778.84539: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853778.84552: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853778.84565: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853778.84590: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853778.84593: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853778.84596: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853778.84669: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853778.84675: Set connection var ansible_timeout to 10 30583 1726853778.84678: Set connection var ansible_connection to ssh 30583 1726853778.84684: Set connection var ansible_shell_executable to /bin/sh 30583 1726853778.84687: Set connection var ansible_shell_type to sh 30583 1726853778.84694: Set connection var ansible_pipelining to False 30583 1726853778.84712: variable 'ansible_shell_executable' from source: unknown 30583 1726853778.84715: variable 'ansible_connection' from source: unknown 30583 1726853778.84718: variable 'ansible_module_compression' from source: unknown 30583 1726853778.84720: variable 'ansible_shell_type' from source: unknown 30583 1726853778.84723: variable 'ansible_shell_executable' from source: unknown 30583 1726853778.84725: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853778.84729: variable 'ansible_pipelining' from source: unknown 30583 1726853778.84737: variable 'ansible_timeout' from source: unknown 30583 1726853778.84739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853778.84886: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853778.84896: variable 'omit' from source: magic vars 30583 1726853778.84900: starting attempt loop 30583 1726853778.84903: running the handler 30583 1726853778.84916: _low_level_execute_command(): starting 30583 1726853778.84922: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853778.85453: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30583 1726853778.85457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853778.85460: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853778.85463: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853778.85518: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853778.85522: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853778.85524: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853778.85603: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853778.87328: stdout chunk (state=3): >>>/root <<< 30583 1726853778.87431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853778.87466: stderr chunk (state=3): >>><<< 30583 1726853778.87469: stdout chunk (state=3): >>><<< 30583 1726853778.87493: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853778.87503: _low_level_execute_command(): starting 30583 1726853778.87509: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853778.8749218-35811-179843905009932 `" && echo ansible-tmp-1726853778.8749218-35811-179843905009932="` echo /root/.ansible/tmp/ansible-tmp-1726853778.8749218-35811-179843905009932 `" ) && sleep 0' 30583 1726853778.87938: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853778.87946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853778.87977: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853778.87988: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853778.87990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853778.87993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853778.88038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853778.88041: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853778.88118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853778.90117: stdout chunk (state=3): >>>ansible-tmp-1726853778.8749218-35811-179843905009932=/root/.ansible/tmp/ansible-tmp-1726853778.8749218-35811-179843905009932 <<< 30583 1726853778.90225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853778.90256: stderr chunk (state=3): >>><<< 30583 1726853778.90262: stdout chunk (state=3): >>><<< 30583 1726853778.90275: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853778.8749218-35811-179843905009932=/root/.ansible/tmp/ansible-tmp-1726853778.8749218-35811-179843905009932 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853778.90316: variable 'ansible_module_compression' from source: unknown 30583 1726853778.90351: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30583 1726853778.90388: variable 'ansible_facts' from source: unknown 30583 1726853778.90439: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853778.8749218-35811-179843905009932/AnsiballZ_service_facts.py 30583 1726853778.90540: Sending initial data 30583 1726853778.90543: Sent initial data (162 bytes) 30583 1726853778.90975: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853778.91005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853778.91009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853778.91012: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 
is address <<< 30583 1726853778.91014: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853778.91016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853778.91070: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853778.91076: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853778.91141: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853778.92806: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30583 1726853778.92810: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853778.92875: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853778.92943: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpxqeb5q7z /root/.ansible/tmp/ansible-tmp-1726853778.8749218-35811-179843905009932/AnsiballZ_service_facts.py <<< 30583 1726853778.92949: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853778.8749218-35811-179843905009932/AnsiballZ_service_facts.py" <<< 30583 1726853778.93010: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpxqeb5q7z" to remote "/root/.ansible/tmp/ansible-tmp-1726853778.8749218-35811-179843905009932/AnsiballZ_service_facts.py" <<< 30583 1726853778.93014: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853778.8749218-35811-179843905009932/AnsiballZ_service_facts.py" <<< 30583 1726853778.93674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853778.93715: stderr chunk (state=3): >>><<< 30583 1726853778.93718: stdout chunk (state=3): >>><<< 30583 1726853778.93738: done transferring module to remote 30583 1726853778.93747: _low_level_execute_command(): starting 30583 1726853778.93754: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853778.8749218-35811-179843905009932/ /root/.ansible/tmp/ansible-tmp-1726853778.8749218-35811-179843905009932/AnsiballZ_service_facts.py && sleep 0' 30583 1726853778.94207: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853778.94210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853778.94212: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853778.94214: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853778.94220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853778.94274: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853778.94279: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853778.94350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853778.96281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853778.96314: stderr chunk (state=3): >>><<< 30583 1726853778.96316: stdout chunk (state=3): >>><<< 30583 1726853778.96328: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853778.96331: _low_level_execute_command(): starting 30583 1726853778.96336: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853778.8749218-35811-179843905009932/AnsiballZ_service_facts.py && sleep 0' 30583 1726853778.96786: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853778.96789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853778.96792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853778.96794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853778.96798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 
1726853778.96847: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853778.96854: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853778.96858: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853778.96932: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853780.59308: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": 
"sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 30583 1726853780.59327: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": 
"systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 30583 1726853780.59358: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": 
{"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", 
"state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30583 1726853780.60977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853780.61012: stderr chunk (state=3): >>><<< 30583 1726853780.61015: stdout chunk (state=3): >>><<< 30583 1726853780.61041: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": 
"getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": 
"systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": 
"running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": 
"inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": 
"systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853780.61742: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853778.8749218-35811-179843905009932/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853780.61751: _low_level_execute_command(): starting 30583 1726853780.61757: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853778.8749218-35811-179843905009932/ > /dev/null 2>&1 && sleep 0' 30583 1726853780.62221: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853780.62224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853780.62226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853780.62228: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30583 1726853780.62230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853780.62277: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853780.62290: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853780.62375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853780.64320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853780.64347: stderr chunk (state=3): >>><<< 30583 1726853780.64350: stdout chunk (state=3): >>><<< 30583 1726853780.64365: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853780.64373: handler run complete 30583 1726853780.64490: variable 'ansible_facts' from source: unknown 30583 1726853780.64592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853780.64881: variable 'ansible_facts' from source: unknown 30583 1726853780.64962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853780.65080: attempt loop complete, returning result 30583 1726853780.65084: _execute() done 30583 1726853780.65086: dumping result to json 30583 1726853780.65122: done dumping result, returning 30583 1726853780.65130: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-05ea-abc5-000000002384] 30583 1726853780.65135: sending task result for task 02083763-bbaf-05ea-abc5-000000002384 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853780.65987: no more pending results, returning what we have 30583 1726853780.65991: results queue empty 30583 1726853780.65992: checking for any_errors_fatal 30583 1726853780.65995: done checking for any_errors_fatal 30583 1726853780.65996: checking for max_fail_percentage 30583 1726853780.66002: done checking for max_fail_percentage 30583 1726853780.66003: checking to see if all hosts have failed and the running result is not ok 30583 1726853780.66003: done checking to see if all hosts have failed 30583 1726853780.66004: getting the remaining hosts for this loop 30583 1726853780.66005: done getting the remaining hosts for this loop 30583 1726853780.66009: getting the next task for host managed_node2 30583 1726853780.66016: done getting next task 
for host managed_node2 30583 1726853780.66019: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 1726853780.66026: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853780.66036: done sending task result for task 02083763-bbaf-05ea-abc5-000000002384 30583 1726853780.66039: WORKER PROCESS EXITING 30583 1726853780.66047: getting variables 30583 1726853780.66048: in VariableManager get_vars() 30583 1726853780.66080: Calling all_inventory to load vars for managed_node2 30583 1726853780.66083: Calling groups_inventory to load vars for managed_node2 30583 1726853780.66085: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853780.66093: Calling all_plugins_play to load vars for managed_node2 30583 1726853780.66096: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853780.66109: Calling groups_plugins_play to load vars for managed_node2 30583 1726853780.67350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853780.68333: done with get_vars() 30583 1726853780.68357: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:36:20 -0400 (0:00:01.850) 0:01:56.021 ****** 30583 1726853780.68433: entering _queue_task() for managed_node2/package_facts 30583 1726853780.68702: worker is 1 (out of 1 available) 30583 1726853780.68717: exiting _queue_task() for managed_node2/package_facts 30583 1726853780.68732: done queuing things up, now waiting for results queue to drain 30583 1726853780.68733: waiting for pending results... 
30583 1726853780.68933: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 1726853780.69043: in run() - task 02083763-bbaf-05ea-abc5-000000002385 30583 1726853780.69055: variable 'ansible_search_path' from source: unknown 30583 1726853780.69059: variable 'ansible_search_path' from source: unknown 30583 1726853780.69092: calling self._execute() 30583 1726853780.69167: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853780.69176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853780.69184: variable 'omit' from source: magic vars 30583 1726853780.69474: variable 'ansible_distribution_major_version' from source: facts 30583 1726853780.69483: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853780.69489: variable 'omit' from source: magic vars 30583 1726853780.69546: variable 'omit' from source: magic vars 30583 1726853780.69575: variable 'omit' from source: magic vars 30583 1726853780.69608: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853780.69638: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853780.69655: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853780.69672: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853780.69682: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853780.69706: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853780.69709: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853780.69712: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853780.69785: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853780.69789: Set connection var ansible_timeout to 10 30583 1726853780.69792: Set connection var ansible_connection to ssh 30583 1726853780.69797: Set connection var ansible_shell_executable to /bin/sh 30583 1726853780.69800: Set connection var ansible_shell_type to sh 30583 1726853780.69808: Set connection var ansible_pipelining to False 30583 1726853780.69826: variable 'ansible_shell_executable' from source: unknown 30583 1726853780.69830: variable 'ansible_connection' from source: unknown 30583 1726853780.69833: variable 'ansible_module_compression' from source: unknown 30583 1726853780.69836: variable 'ansible_shell_type' from source: unknown 30583 1726853780.69839: variable 'ansible_shell_executable' from source: unknown 30583 1726853780.69841: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853780.69843: variable 'ansible_pipelining' from source: unknown 30583 1726853780.69845: variable 'ansible_timeout' from source: unknown 30583 1726853780.69847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853780.69993: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853780.70003: variable 'omit' from source: magic vars 30583 1726853780.70008: starting attempt loop 30583 1726853780.70010: running the handler 30583 1726853780.70022: _low_level_execute_command(): starting 30583 1726853780.70028: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853780.70550: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30583 1726853780.70554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853780.70560: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853780.70563: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853780.70613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853780.70616: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853780.70618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853780.70700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853780.72427: stdout chunk (state=3): >>>/root <<< 30583 1726853780.72528: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853780.72556: stderr chunk (state=3): >>><<< 30583 1726853780.72562: stdout chunk (state=3): >>><<< 30583 1726853780.72585: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853780.72594: _low_level_execute_command(): starting 30583 1726853780.72599: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853780.7258208-35842-61620673195871 `" && echo ansible-tmp-1726853780.7258208-35842-61620673195871="` echo /root/.ansible/tmp/ansible-tmp-1726853780.7258208-35842-61620673195871 `" ) && sleep 0' 30583 1726853780.73045: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853780.73048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853780.73060: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is 
address debug1: re-parsing configuration <<< 30583 1726853780.73063: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853780.73065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853780.73108: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853780.73111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853780.73115: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853780.73184: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853780.75161: stdout chunk (state=3): >>>ansible-tmp-1726853780.7258208-35842-61620673195871=/root/.ansible/tmp/ansible-tmp-1726853780.7258208-35842-61620673195871 <<< 30583 1726853780.75264: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853780.75295: stderr chunk (state=3): >>><<< 30583 1726853780.75299: stdout chunk (state=3): >>><<< 30583 1726853780.75312: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853780.7258208-35842-61620673195871=/root/.ansible/tmp/ansible-tmp-1726853780.7258208-35842-61620673195871 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853780.75352: variable 'ansible_module_compression' from source: unknown 30583 1726853780.75395: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30583 1726853780.75449: variable 'ansible_facts' from source: unknown 30583 1726853780.75569: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853780.7258208-35842-61620673195871/AnsiballZ_package_facts.py 30583 1726853780.75676: Sending initial data 30583 1726853780.75680: Sent initial data (161 bytes) 30583 1726853780.76130: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853780.76133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853780.76135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853780.76137: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853780.76139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853780.76189: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853780.76193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853780.76274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853780.77897: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30583 1726853780.77901: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853780.77964: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853780.78037: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpy3h5ahsn /root/.ansible/tmp/ansible-tmp-1726853780.7258208-35842-61620673195871/AnsiballZ_package_facts.py <<< 30583 1726853780.78043: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853780.7258208-35842-61620673195871/AnsiballZ_package_facts.py" <<< 30583 1726853780.78106: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpy3h5ahsn" to remote "/root/.ansible/tmp/ansible-tmp-1726853780.7258208-35842-61620673195871/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853780.7258208-35842-61620673195871/AnsiballZ_package_facts.py" <<< 30583 1726853780.79299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853780.79338: stderr chunk (state=3): >>><<< 30583 1726853780.79341: stdout chunk (state=3): >>><<< 30583 1726853780.79381: done transferring module to remote 30583 1726853780.79392: _low_level_execute_command(): starting 30583 1726853780.79395: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853780.7258208-35842-61620673195871/ /root/.ansible/tmp/ansible-tmp-1726853780.7258208-35842-61620673195871/AnsiballZ_package_facts.py && sleep 0' 30583 1726853780.79833: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853780.79837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853780.79840: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853780.79846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853780.79895: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853780.79898: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853780.79975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853780.81846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853780.81876: stderr chunk (state=3): >>><<< 30583 1726853780.81879: stdout chunk (state=3): >>><<< 30583 1726853780.81892: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853780.81895: _low_level_execute_command(): starting 30583 1726853780.81900: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853780.7258208-35842-61620673195871/AnsiballZ_package_facts.py && sleep 0' 30583 1726853780.82327: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853780.82331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853780.82345: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853780.82404: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853780.82413: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853780.82416: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853780.82488: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853781.27291: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": 
"default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": 
"6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 30583 1726853781.27318: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", 
"version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 30583 1726853781.27338: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 30583 1726853781.27350: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": 
"libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 30583 1726853781.27370: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", 
"release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": 
"22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 30583 1726853781.27388: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": 
"1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": 
"sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": 
"python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 30583 1726853781.27409: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 30583 1726853781.27413: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 30583 1726853781.27437: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": 
[{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": 
"1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 30583 1726853781.27452: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": 
"4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": 
[{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30583 1726853781.29225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
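The `package_facts` result that finishes above (invoked with `manager: auto`, `strategy: first`) maps each package name under `ansible_facts.packages` to a *list* of install records (`name`, `version`, `release`, `epoch`, `arch`, `source`), since a package can be installed for multiple arches or versions. A minimal Python sketch of consuming that structure offline; the `evr()` helper and its `epoch:version-release` rendering are illustrative conventions, not part of the module output:

```python
import json

# A small excerpt with the same shape as the "packages" fact logged above.
# "epoch": null in the JSON becomes None in Python after parsing.
raw = json.loads("""
{"ansible_facts": {"packages": {
  "git":     [{"name": "git", "version": "2.45.2", "release": "3.el10",
               "epoch": null, "arch": "x86_64", "source": "rpm"}],
  "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10",
               "epoch": 1, "arch": "x86_64", "source": "rpm"}]
}}}
""")

packages = raw["ansible_facts"]["packages"]

def evr(pkg: str) -> str:
    """Render epoch:version-release for the first install record of a package.

    A missing epoch (null/None) is treated as 0, matching common RPM practice.
    """
    rec = packages[pkg][0]  # first record; more may exist for multilib installs
    epoch = rec["epoch"] if rec["epoch"] is not None else 0
    return f"{epoch}:{rec['version']}-{rec['release']}"

print(evr("git"))      # 0:2.45.2-3.el10
print(evr("openssl"))  # 1:3.2.2-12.el10
```

In a playbook the same lookup would go through the `ansible_facts.packages` variable after a `package_facts` task; the sketch above only mirrors the data shape visible in this log.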
<<< 30583 1726853781.29252: stderr chunk (state=3): >>><<< 30583 1726853781.29255: stdout chunk (state=3): >>><<< 30583 1726853781.29298: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853781.30524: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853780.7258208-35842-61620673195871/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853781.30541: _low_level_execute_command(): starting 30583 1726853781.30546: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853780.7258208-35842-61620673195871/ > /dev/null 2>&1 && sleep 0' 30583 1726853781.31001: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853781.31004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853781.31007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853781.31009: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853781.31011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853781.31065: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853781.31068: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853781.31070: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853781.31154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853781.33063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853781.33091: stderr chunk (state=3): >>><<< 30583 1726853781.33094: stdout chunk (state=3): >>><<< 30583 1726853781.33108: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853781.33114: handler run complete 30583 1726853781.33646: variable 'ansible_facts' from source: unknown 30583 1726853781.33919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853781.34962: variable 'ansible_facts' from source: unknown 30583 1726853781.35202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853781.35582: attempt loop complete, returning result 30583 1726853781.35592: _execute() done 30583 1726853781.35594: dumping result to json 30583 1726853781.35711: done dumping result, returning 30583 1726853781.35719: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-05ea-abc5-000000002385] 30583 1726853781.35722: sending task result for task 02083763-bbaf-05ea-abc5-000000002385 30583 1726853781.37091: done sending task result for task 02083763-bbaf-05ea-abc5-000000002385 30583 1726853781.37095: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853781.37201: no more pending results, returning what we have 30583 1726853781.37203: results queue empty 30583 1726853781.37203: checking for any_errors_fatal 30583 1726853781.37207: done checking for any_errors_fatal 30583 1726853781.37208: checking for max_fail_percentage 30583 1726853781.37209: done checking for max_fail_percentage 30583 1726853781.37209: checking to see if all hosts have failed and the running result is not ok 30583 1726853781.37210: done checking to see if all hosts have failed 30583 1726853781.37210: getting the remaining hosts for this loop 30583 1726853781.37211: done getting the remaining hosts for this loop 30583 1726853781.37213: getting 
the next task for host managed_node2 30583 1726853781.37221: done getting next task for host managed_node2 30583 1726853781.37223: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853781.37227: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853781.37235: getting variables 30583 1726853781.37236: in VariableManager get_vars() 30583 1726853781.37260: Calling all_inventory to load vars for managed_node2 30583 1726853781.37262: Calling groups_inventory to load vars for managed_node2 30583 1726853781.37264: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853781.37270: Calling all_plugins_play to load vars for managed_node2 30583 1726853781.37274: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853781.37276: Calling groups_plugins_play to load vars for managed_node2 30583 1726853781.37959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853781.38818: done with get_vars() 30583 1726853781.38840: done getting variables 30583 1726853781.38885: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:36:21 -0400 (0:00:00.704) 0:01:56.726 ****** 30583 1726853781.38910: entering _queue_task() for managed_node2/debug 30583 1726853781.39157: worker is 1 (out of 1 available) 30583 1726853781.39172: exiting _queue_task() for managed_node2/debug 30583 1726853781.39186: done queuing things up, now waiting for results queue to drain 30583 1726853781.39188: waiting for pending results... 
30583 1726853781.39375: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853781.39461: in run() - task 02083763-bbaf-05ea-abc5-000000002329 30583 1726853781.39475: variable 'ansible_search_path' from source: unknown 30583 1726853781.39479: variable 'ansible_search_path' from source: unknown 30583 1726853781.39507: calling self._execute() 30583 1726853781.39586: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853781.39590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853781.39598: variable 'omit' from source: magic vars 30583 1726853781.39892: variable 'ansible_distribution_major_version' from source: facts 30583 1726853781.39901: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853781.39907: variable 'omit' from source: magic vars 30583 1726853781.39953: variable 'omit' from source: magic vars 30583 1726853781.40021: variable 'network_provider' from source: set_fact 30583 1726853781.40036: variable 'omit' from source: magic vars 30583 1726853781.40073: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853781.40100: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853781.40115: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853781.40128: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853781.40138: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853781.40164: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853781.40167: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 
1726853781.40170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853781.40241: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853781.40246: Set connection var ansible_timeout to 10 30583 1726853781.40249: Set connection var ansible_connection to ssh 30583 1726853781.40254: Set connection var ansible_shell_executable to /bin/sh 30583 1726853781.40256: Set connection var ansible_shell_type to sh 30583 1726853781.40266: Set connection var ansible_pipelining to False 30583 1726853781.40287: variable 'ansible_shell_executable' from source: unknown 30583 1726853781.40290: variable 'ansible_connection' from source: unknown 30583 1726853781.40293: variable 'ansible_module_compression' from source: unknown 30583 1726853781.40295: variable 'ansible_shell_type' from source: unknown 30583 1726853781.40297: variable 'ansible_shell_executable' from source: unknown 30583 1726853781.40299: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853781.40304: variable 'ansible_pipelining' from source: unknown 30583 1726853781.40306: variable 'ansible_timeout' from source: unknown 30583 1726853781.40310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853781.40413: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853781.40422: variable 'omit' from source: magic vars 30583 1726853781.40427: starting attempt loop 30583 1726853781.40430: running the handler 30583 1726853781.40469: handler run complete 30583 1726853781.40482: attempt loop complete, returning result 30583 1726853781.40485: _execute() done 30583 1726853781.40488: dumping result to json 30583 1726853781.40490: done dumping result, returning 
30583 1726853781.40498: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-05ea-abc5-000000002329] 30583 1726853781.40500: sending task result for task 02083763-bbaf-05ea-abc5-000000002329 30583 1726853781.40577: done sending task result for task 02083763-bbaf-05ea-abc5-000000002329 30583 1726853781.40580: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 30583 1726853781.40680: no more pending results, returning what we have 30583 1726853781.40683: results queue empty 30583 1726853781.40684: checking for any_errors_fatal 30583 1726853781.40691: done checking for any_errors_fatal 30583 1726853781.40692: checking for max_fail_percentage 30583 1726853781.40693: done checking for max_fail_percentage 30583 1726853781.40694: checking to see if all hosts have failed and the running result is not ok 30583 1726853781.40695: done checking to see if all hosts have failed 30583 1726853781.40695: getting the remaining hosts for this loop 30583 1726853781.40697: done getting the remaining hosts for this loop 30583 1726853781.40700: getting the next task for host managed_node2 30583 1726853781.40708: done getting next task for host managed_node2 30583 1726853781.40712: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853781.40716: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853781.40728: getting variables 30583 1726853781.40730: in VariableManager get_vars() 30583 1726853781.40765: Calling all_inventory to load vars for managed_node2 30583 1726853781.40768: Calling groups_inventory to load vars for managed_node2 30583 1726853781.40770: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853781.40783: Calling all_plugins_play to load vars for managed_node2 30583 1726853781.40785: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853781.40788: Calling groups_plugins_play to load vars for managed_node2 30583 1726853781.41619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853781.42483: done with get_vars() 30583 1726853781.42499: done getting variables 30583 1726853781.42539: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:36:21 -0400 (0:00:00.036) 0:01:56.762 ****** 30583 1726853781.42568: entering _queue_task() for managed_node2/fail 30583 1726853781.42796: worker is 1 (out of 1 available) 30583 1726853781.42810: exiting _queue_task() for managed_node2/fail 30583 1726853781.42823: done queuing things up, now waiting for results queue to drain 30583 1726853781.42824: waiting for pending results... 30583 1726853781.43011: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853781.43112: in run() - task 02083763-bbaf-05ea-abc5-00000000232a 30583 1726853781.43123: variable 'ansible_search_path' from source: unknown 30583 1726853781.43126: variable 'ansible_search_path' from source: unknown 30583 1726853781.43155: calling self._execute() 30583 1726853781.43229: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853781.43233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853781.43242: variable 'omit' from source: magic vars 30583 1726853781.43523: variable 'ansible_distribution_major_version' from source: facts 30583 1726853781.43533: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853781.43621: variable 'network_state' from source: role '' defaults 30583 1726853781.43629: Evaluated conditional (network_state != {}): False 30583 1726853781.43633: when evaluation is False, skipping this task 30583 1726853781.43635: _execute() done 30583 1726853781.43638: dumping result to json 30583 1726853781.43641: done dumping result, returning 30583 1726853781.43647: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-05ea-abc5-00000000232a] 30583 1726853781.43650: sending task result for task 02083763-bbaf-05ea-abc5-00000000232a 30583 1726853781.43735: done sending task result for task 02083763-bbaf-05ea-abc5-00000000232a 30583 1726853781.43738: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853781.43787: no more pending results, returning what we have 30583 1726853781.43791: results queue empty 30583 1726853781.43792: checking for any_errors_fatal 30583 1726853781.43798: done checking for any_errors_fatal 30583 1726853781.43798: checking for max_fail_percentage 30583 1726853781.43800: done checking for max_fail_percentage 30583 1726853781.43801: checking to see if all hosts have failed and the running result is not ok 30583 1726853781.43801: done checking to see if all hosts have failed 30583 1726853781.43802: getting the remaining hosts for this loop 30583 1726853781.43805: done getting the remaining hosts for this loop 30583 1726853781.43808: getting the next task for host managed_node2 30583 1726853781.43815: done getting next task for host managed_node2 30583 1726853781.43819: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30583 1726853781.43824: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853781.43844: getting variables 30583 1726853781.43846: in VariableManager get_vars() 30583 1726853781.43887: Calling all_inventory to load vars for managed_node2 30583 1726853781.43890: Calling groups_inventory to load vars for managed_node2 30583 1726853781.43892: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853781.43899: Calling all_plugins_play to load vars for managed_node2 30583 1726853781.43902: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853781.43904: Calling groups_plugins_play to load vars for managed_node2 30583 1726853781.44654: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853781.45620: done with get_vars() 30583 1726853781.45635: done getting variables 30583 1726853781.45678: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:36:21 -0400 (0:00:00.031) 0:01:56.794 ****** 30583 1726853781.45703: entering _queue_task() for managed_node2/fail 30583 1726853781.45923: worker is 1 (out of 1 available) 30583 1726853781.45937: exiting _queue_task() for managed_node2/fail 30583 1726853781.45951: done queuing things up, now waiting for results queue to drain 30583 1726853781.45952: waiting for pending results... 30583 1726853781.46134: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30583 1726853781.46216: in run() - task 02083763-bbaf-05ea-abc5-00000000232b 30583 1726853781.46225: variable 'ansible_search_path' from source: unknown 30583 1726853781.46229: variable 'ansible_search_path' from source: unknown 30583 1726853781.46258: calling self._execute() 30583 1726853781.46338: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853781.46342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853781.46351: variable 'omit' from source: magic vars 30583 1726853781.46626: variable 'ansible_distribution_major_version' from source: facts 30583 1726853781.46636: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853781.46724: variable 'network_state' from source: role '' defaults 30583 1726853781.46729: Evaluated conditional (network_state != {}): False 30583 1726853781.46732: when evaluation is False, skipping this task 30583 1726853781.46735: _execute() done 30583 1726853781.46737: dumping result to json 30583 1726853781.46741: done dumping result, returning 30583 1726853781.46749: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [02083763-bbaf-05ea-abc5-00000000232b] 30583 1726853781.46752: sending task result for task 02083763-bbaf-05ea-abc5-00000000232b 30583 1726853781.46844: done sending task result for task 02083763-bbaf-05ea-abc5-00000000232b 30583 1726853781.46847: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853781.46893: no more pending results, returning what we have 30583 1726853781.46897: results queue empty 30583 1726853781.46898: checking for any_errors_fatal 30583 1726853781.46906: done checking for any_errors_fatal 30583 1726853781.46907: checking for max_fail_percentage 30583 1726853781.46909: done checking for max_fail_percentage 30583 1726853781.46910: checking to see if all hosts have failed and the running result is not ok 30583 1726853781.46910: done checking to see if all hosts have failed 30583 1726853781.46911: getting the remaining hosts for this loop 30583 1726853781.46913: done getting the remaining hosts for this loop 30583 1726853781.46916: getting the next task for host managed_node2 30583 1726853781.46924: done getting next task for host managed_node2 30583 1726853781.46928: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30583 1726853781.46933: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853781.46952: getting variables 30583 1726853781.46953: in VariableManager get_vars() 30583 1726853781.46989: Calling all_inventory to load vars for managed_node2 30583 1726853781.46992: Calling groups_inventory to load vars for managed_node2 30583 1726853781.46994: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853781.47002: Calling all_plugins_play to load vars for managed_node2 30583 1726853781.47004: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853781.47007: Calling groups_plugins_play to load vars for managed_node2 30583 1726853781.51913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853781.52751: done with get_vars() 30583 1726853781.52768: done getting variables 30583 1726853781.52804: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:36:21 -0400 (0:00:00.071) 0:01:56.865 ****** 30583 1726853781.52825: entering _queue_task() for managed_node2/fail 30583 1726853781.53097: worker is 1 (out of 1 available) 30583 1726853781.53110: exiting _queue_task() for managed_node2/fail 30583 1726853781.53123: done queuing things up, now waiting for results queue to drain 30583 1726853781.53124: waiting for pending results... 30583 1726853781.53322: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30583 1726853781.53423: in run() - task 02083763-bbaf-05ea-abc5-00000000232c 30583 1726853781.53436: variable 'ansible_search_path' from source: unknown 30583 1726853781.53441: variable 'ansible_search_path' from source: unknown 30583 1726853781.53474: calling self._execute() 30583 1726853781.53552: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853781.53557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853781.53568: variable 'omit' from source: magic vars 30583 1726853781.53860: variable 'ansible_distribution_major_version' from source: facts 30583 1726853781.53873: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853781.54000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853781.55547: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853781.55603: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853781.55630: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853781.55658: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853781.55682: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853781.55741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853781.55765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853781.55786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853781.55810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853781.55821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853781.55899: variable 'ansible_distribution_major_version' from source: facts 30583 1726853781.55912: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30583 1726853781.55993: variable 'ansible_distribution' from source: facts 30583 1726853781.55997: variable '__network_rh_distros' from source: role '' defaults 30583 1726853781.56004: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30583 1726853781.56166: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853781.56185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853781.56204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853781.56229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853781.56240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853781.56277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853781.56295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853781.56311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853781.56337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 
1726853781.56346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853781.56378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853781.56395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853781.56411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853781.56436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853781.56447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853781.56651: variable 'network_connections' from source: include params 30583 1726853781.56660: variable 'interface' from source: play vars 30583 1726853781.56709: variable 'interface' from source: play vars 30583 1726853781.56717: variable 'network_state' from source: role '' defaults 30583 1726853781.56767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853781.56895: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853781.56924: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853781.56948: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853781.56973: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853781.57004: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853781.57020: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853781.57043: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853781.57060: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853781.57084: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30583 1726853781.57087: when evaluation is False, skipping this task 30583 1726853781.57090: _execute() done 30583 1726853781.57092: dumping result to json 30583 1726853781.57094: done dumping result, returning 30583 1726853781.57104: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-05ea-abc5-00000000232c] 30583 1726853781.57107: sending task result for task 
02083763-bbaf-05ea-abc5-00000000232c 30583 1726853781.57202: done sending task result for task 02083763-bbaf-05ea-abc5-00000000232c 30583 1726853781.57204: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30583 1726853781.57248: no more pending results, returning what we have 30583 1726853781.57252: results queue empty 30583 1726853781.57253: checking for any_errors_fatal 30583 1726853781.57264: done checking for any_errors_fatal 30583 1726853781.57264: checking for max_fail_percentage 30583 1726853781.57266: done checking for max_fail_percentage 30583 1726853781.57267: checking to see if all hosts have failed and the running result is not ok 30583 1726853781.57268: done checking to see if all hosts have failed 30583 1726853781.57268: getting the remaining hosts for this loop 30583 1726853781.57273: done getting the remaining hosts for this loop 30583 1726853781.57276: getting the next task for host managed_node2 30583 1726853781.57284: done getting next task for host managed_node2 30583 1726853781.57288: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30583 1726853781.57293: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853781.57318: getting variables 30583 1726853781.57320: in VariableManager get_vars() 30583 1726853781.57364: Calling all_inventory to load vars for managed_node2 30583 1726853781.57367: Calling groups_inventory to load vars for managed_node2 30583 1726853781.57369: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853781.57383: Calling all_plugins_play to load vars for managed_node2 30583 1726853781.57386: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853781.57389: Calling groups_plugins_play to load vars for managed_node2 30583 1726853781.58204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853781.59199: done with get_vars() 30583 1726853781.59216: done getting variables 30583 1726853781.59257: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:36:21 -0400 (0:00:00.064) 0:01:56.930 ****** 30583 1726853781.59284: entering _queue_task() for managed_node2/dnf 30583 1726853781.59530: worker is 1 (out of 1 available) 30583 1726853781.59546: exiting _queue_task() for managed_node2/dnf 30583 1726853781.59561: done queuing things up, now waiting for results queue to drain 30583 1726853781.59562: waiting for pending results... 30583 1726853781.59749: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30583 1726853781.59867: in run() - task 02083763-bbaf-05ea-abc5-00000000232d 30583 1726853781.59879: variable 'ansible_search_path' from source: unknown 30583 1726853781.59884: variable 'ansible_search_path' from source: unknown 30583 1726853781.59915: calling self._execute() 30583 1726853781.59991: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853781.59994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853781.60007: variable 'omit' from source: magic vars 30583 1726853781.60279: variable 'ansible_distribution_major_version' from source: facts 30583 1726853781.60288: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853781.60419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853781.61919: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853781.61975: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853781.62001: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853781.62026: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853781.62046: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853781.62104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853781.62124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853781.62141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853781.62167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853781.62182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853781.62264: variable 'ansible_distribution' from source: facts 30583 1726853781.62267: variable 'ansible_distribution_major_version' from source: facts 30583 1726853781.62288: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30583 1726853781.62353: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853781.62436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853781.62452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853781.62469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853781.62495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853781.62507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853781.62534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853781.62550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853781.62567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853781.62592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853781.62602: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853781.62630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853781.62647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853781.62663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853781.62689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853781.62699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853781.62800: variable 'network_connections' from source: include params 30583 1726853781.62808: variable 'interface' from source: play vars 30583 1726853781.62853: variable 'interface' from source: play vars 30583 1726853781.62902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853781.63019: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853781.63046: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853781.63076: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853781.63100: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853781.63130: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853781.63146: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853781.63174: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853781.63192: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853781.63226: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853781.63375: variable 'network_connections' from source: include params 30583 1726853781.63378: variable 'interface' from source: play vars 30583 1726853781.63421: variable 'interface' from source: play vars 30583 1726853781.63440: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853781.63443: when evaluation is False, skipping this task 30583 1726853781.63446: _execute() done 30583 1726853781.63449: dumping result to json 30583 1726853781.63451: done dumping result, returning 30583 1726853781.63458: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-00000000232d] 30583 
1726853781.63464: sending task result for task 02083763-bbaf-05ea-abc5-00000000232d 30583 1726853781.63551: done sending task result for task 02083763-bbaf-05ea-abc5-00000000232d 30583 1726853781.63554: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853781.63602: no more pending results, returning what we have 30583 1726853781.63605: results queue empty 30583 1726853781.63606: checking for any_errors_fatal 30583 1726853781.63612: done checking for any_errors_fatal 30583 1726853781.63612: checking for max_fail_percentage 30583 1726853781.63614: done checking for max_fail_percentage 30583 1726853781.63615: checking to see if all hosts have failed and the running result is not ok 30583 1726853781.63616: done checking to see if all hosts have failed 30583 1726853781.63616: getting the remaining hosts for this loop 30583 1726853781.63618: done getting the remaining hosts for this loop 30583 1726853781.63622: getting the next task for host managed_node2 30583 1726853781.63629: done getting next task for host managed_node2 30583 1726853781.63633: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30583 1726853781.63638: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853781.63660: getting variables 30583 1726853781.63662: in VariableManager get_vars() 30583 1726853781.63705: Calling all_inventory to load vars for managed_node2 30583 1726853781.63708: Calling groups_inventory to load vars for managed_node2 30583 1726853781.63710: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853781.63719: Calling all_plugins_play to load vars for managed_node2 30583 1726853781.63722: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853781.63725: Calling groups_plugins_play to load vars for managed_node2 30583 1726853781.64563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853781.65439: done with get_vars() 30583 1726853781.65456: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30583 1726853781.65513: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:36:21 -0400 (0:00:00.062) 0:01:56.992 ****** 30583 1726853781.65538: entering _queue_task() for managed_node2/yum 30583 1726853781.65794: worker is 1 (out of 1 available) 30583 1726853781.65810: exiting _queue_task() for managed_node2/yum 30583 1726853781.65822: done queuing things up, now waiting for results queue to drain 30583 1726853781.65824: waiting for pending results... 30583 1726853781.66011: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30583 1726853781.66104: in run() - task 02083763-bbaf-05ea-abc5-00000000232e 30583 1726853781.66115: variable 'ansible_search_path' from source: unknown 30583 1726853781.66119: variable 'ansible_search_path' from source: unknown 30583 1726853781.66146: calling self._execute() 30583 1726853781.66229: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853781.66233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853781.66244: variable 'omit' from source: magic vars 30583 1726853781.66533: variable 'ansible_distribution_major_version' from source: facts 30583 1726853781.66542: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853781.66666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853781.68779: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853781.69079: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853781.69107: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853781.69132: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853781.69152: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853781.69215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853781.69236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853781.69254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853781.69284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853781.69297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853781.69368: variable 'ansible_distribution_major_version' from source: facts 30583 1726853781.69383: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30583 1726853781.69386: when evaluation is False, skipping this task 30583 1726853781.69389: _execute() done 30583 1726853781.69391: dumping result to json 30583 1726853781.69393: done dumping result, returning 30583 1726853781.69403: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-00000000232e] 30583 1726853781.69406: sending task result for task 02083763-bbaf-05ea-abc5-00000000232e 30583 1726853781.69498: done sending task result for task 02083763-bbaf-05ea-abc5-00000000232e 30583 1726853781.69501: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30583 1726853781.69548: no more pending results, returning what we have 30583 1726853781.69551: results queue empty 30583 1726853781.69552: checking for any_errors_fatal 30583 1726853781.69557: done checking for any_errors_fatal 30583 1726853781.69558: checking for max_fail_percentage 30583 1726853781.69560: done checking for max_fail_percentage 30583 1726853781.69561: checking to see if all hosts have failed and the running result is not ok 30583 1726853781.69562: done checking to see if all hosts have failed 30583 1726853781.69562: getting the remaining hosts for this loop 30583 1726853781.69564: done getting the remaining hosts for this loop 30583 1726853781.69567: getting the next task for host managed_node2 30583 1726853781.69577: done getting next task for host managed_node2 30583 1726853781.69581: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30583 1726853781.69586: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853781.69610: getting variables 30583 1726853781.69611: in VariableManager get_vars() 30583 1726853781.69654: Calling all_inventory to load vars for managed_node2 30583 1726853781.69657: Calling groups_inventory to load vars for managed_node2 30583 1726853781.69659: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853781.69668: Calling all_plugins_play to load vars for managed_node2 30583 1726853781.69676: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853781.69679: Calling groups_plugins_play to load vars for managed_node2 30583 1726853781.70678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853781.71527: done with get_vars() 30583 1726853781.71543: done getting variables 30583 1726853781.71590: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:36:21 -0400 (0:00:00.060) 0:01:57.053 ****** 30583 1726853781.71616: entering _queue_task() for managed_node2/fail 30583 1726853781.71879: worker is 1 (out of 1 available) 30583 1726853781.71894: exiting _queue_task() for managed_node2/fail 30583 1726853781.71905: done queuing things up, now waiting for results queue to drain 30583 1726853781.71907: waiting for pending results... 30583 1726853781.72103: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30583 1726853781.72210: in run() - task 02083763-bbaf-05ea-abc5-00000000232f 30583 1726853781.72221: variable 'ansible_search_path' from source: unknown 30583 1726853781.72225: variable 'ansible_search_path' from source: unknown 30583 1726853781.72255: calling self._execute() 30583 1726853781.72336: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853781.72341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853781.72352: variable 'omit' from source: magic vars 30583 1726853781.72627: variable 'ansible_distribution_major_version' from source: facts 30583 1726853781.72637: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853781.72724: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853781.72856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853781.74364: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853781.74420: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853781.74445: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853781.74472: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853781.74493: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853781.74551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853781.74574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853781.74593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853781.74617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853781.74633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853781.74747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853781.74751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853781.74753: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853781.74756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853781.74761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853781.74763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853781.74766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853781.74781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853781.74804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853781.74815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853781.74929: variable 'network_connections' from source: include params 30583 1726853781.74939: variable 'interface' from source: play vars 30583 1726853781.74989: variable 'interface' from source: play vars 30583 1726853781.75038: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853781.75145: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853781.75186: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853781.75209: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853781.75229: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853781.75261: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853781.75277: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853781.75296: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853781.75314: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853781.75351: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853781.75512: variable 'network_connections' from source: include params 30583 1726853781.75515: variable 'interface' from source: play vars 30583 1726853781.75555: variable 'interface' from source: play vars 30583 1726853781.75575: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853781.75579: when evaluation is False, skipping this task 30583 
1726853781.75581: _execute() done 30583 1726853781.75584: dumping result to json 30583 1726853781.75586: done dumping result, returning 30583 1726853781.75594: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-00000000232f] 30583 1726853781.75598: sending task result for task 02083763-bbaf-05ea-abc5-00000000232f 30583 1726853781.75691: done sending task result for task 02083763-bbaf-05ea-abc5-00000000232f 30583 1726853781.75694: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853781.75777: no more pending results, returning what we have 30583 1726853781.75781: results queue empty 30583 1726853781.75782: checking for any_errors_fatal 30583 1726853781.75790: done checking for any_errors_fatal 30583 1726853781.75791: checking for max_fail_percentage 30583 1726853781.75793: done checking for max_fail_percentage 30583 1726853781.75794: checking to see if all hosts have failed and the running result is not ok 30583 1726853781.75794: done checking to see if all hosts have failed 30583 1726853781.75795: getting the remaining hosts for this loop 30583 1726853781.75797: done getting the remaining hosts for this loop 30583 1726853781.75800: getting the next task for host managed_node2 30583 1726853781.75808: done getting next task for host managed_node2 30583 1726853781.75812: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30583 1726853781.75817: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853781.75840: getting variables 30583 1726853781.75842: in VariableManager get_vars() 30583 1726853781.75887: Calling all_inventory to load vars for managed_node2 30583 1726853781.75889: Calling groups_inventory to load vars for managed_node2 30583 1726853781.75892: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853781.75900: Calling all_plugins_play to load vars for managed_node2 30583 1726853781.75903: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853781.75905: Calling groups_plugins_play to load vars for managed_node2 30583 1726853781.76715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853781.77602: done with get_vars() 30583 1726853781.77623: done getting variables 30583 1726853781.77668: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:36:21 -0400 (0:00:00.060) 0:01:57.114 ****** 30583 1726853781.77700: entering _queue_task() for managed_node2/package 30583 1726853781.77972: worker is 1 (out of 1 available) 30583 1726853781.77987: exiting _queue_task() for managed_node2/package 30583 1726853781.78000: done queuing things up, now waiting for results queue to drain 30583 1726853781.78001: waiting for pending results... 30583 1726853781.78201: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 30583 1726853781.78306: in run() - task 02083763-bbaf-05ea-abc5-000000002330 30583 1726853781.78317: variable 'ansible_search_path' from source: unknown 30583 1726853781.78321: variable 'ansible_search_path' from source: unknown 30583 1726853781.78354: calling self._execute() 30583 1726853781.78435: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853781.78440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853781.78448: variable 'omit' from source: magic vars 30583 1726853781.78744: variable 'ansible_distribution_major_version' from source: facts 30583 1726853781.78752: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853781.78987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853781.79095: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853781.79130: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853781.79156: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853781.79219: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853781.79299: variable 'network_packages' from source: role '' defaults 30583 1726853781.79375: variable '__network_provider_setup' from source: role '' defaults 30583 1726853781.79384: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853781.79428: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853781.79476: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853781.79484: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853781.79601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853781.81209: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853781.81251: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853781.81285: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853781.81308: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853781.81327: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853781.81390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853781.81410: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853781.81427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853781.81453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853781.81466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853781.81500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853781.81516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853781.81532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853781.81555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853781.81568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 
1726853781.81725: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853781.81802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853781.81820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853781.81837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853781.81862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853781.81874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853781.81936: variable 'ansible_python' from source: facts 30583 1726853781.81950: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853781.82009: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853781.82067: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853781.82150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853781.82170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853781.82188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853781.82212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853781.82222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853781.82256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853781.82279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853781.82295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853781.82319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853781.82329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853781.82427: variable 'network_connections' from source: include params 
30583 1726853781.82430: variable 'interface' from source: play vars 30583 1726853781.82505: variable 'interface' from source: play vars 30583 1726853781.82552: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853781.82578: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853781.82599: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853781.82619: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853781.82657: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853781.82839: variable 'network_connections' from source: include params 30583 1726853781.82842: variable 'interface' from source: play vars 30583 1726853781.82917: variable 'interface' from source: play vars 30583 1726853781.82939: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853781.82997: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853781.83191: variable 'network_connections' from source: include params 30583 1726853781.83194: variable 'interface' from source: play vars 30583 1726853781.83240: variable 'interface' from source: play vars 30583 1726853781.83256: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853781.83312: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853781.83508: variable 'network_connections' 
from source: include params 30583 1726853781.83511: variable 'interface' from source: play vars 30583 1726853781.83557: variable 'interface' from source: play vars 30583 1726853781.83595: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853781.83635: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853781.83642: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853781.83688: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853781.83820: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853781.84119: variable 'network_connections' from source: include params 30583 1726853781.84122: variable 'interface' from source: play vars 30583 1726853781.84165: variable 'interface' from source: play vars 30583 1726853781.84173: variable 'ansible_distribution' from source: facts 30583 1726853781.84176: variable '__network_rh_distros' from source: role '' defaults 30583 1726853781.84182: variable 'ansible_distribution_major_version' from source: facts 30583 1726853781.84195: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853781.84304: variable 'ansible_distribution' from source: facts 30583 1726853781.84308: variable '__network_rh_distros' from source: role '' defaults 30583 1726853781.84310: variable 'ansible_distribution_major_version' from source: facts 30583 1726853781.84319: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853781.84425: variable 'ansible_distribution' from source: facts 30583 1726853781.84429: variable '__network_rh_distros' from source: role '' defaults 30583 1726853781.84433: variable 'ansible_distribution_major_version' from source: facts 30583 1726853781.84458: variable 'network_provider' from source: set_fact 30583 
1726853781.84473: variable 'ansible_facts' from source: unknown
30583 1726853781.84859: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
30583 1726853781.84862: when evaluation is False, skipping this task
30583 1726853781.84865: _execute() done
30583 1726853781.84867: dumping result to json
30583 1726853781.84870: done dumping result, returning
30583 1726853781.84881: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-05ea-abc5-000000002330]
30583 1726853781.84886: sending task result for task 02083763-bbaf-05ea-abc5-000000002330
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
30583 1726853781.85029: no more pending results, returning what we have
30583 1726853781.85032: results queue empty
30583 1726853781.85033: checking for any_errors_fatal
30583 1726853781.85041: done checking for any_errors_fatal
30583 1726853781.85042: checking for max_fail_percentage
30583 1726853781.85044: done checking for max_fail_percentage
30583 1726853781.85045: checking to see if all hosts have failed and the running result is not ok
30583 1726853781.85046: done checking to see if all hosts have failed
30583 1726853781.85046: getting the remaining hosts for this loop
30583 1726853781.85048: done getting the remaining hosts for this loop
30583 1726853781.85052: getting the next task for host managed_node2
30583 1726853781.85060: done getting next task for host managed_node2
30583 1726853781.85064: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
30583 1726853781.85068: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853781.85097: getting variables 30583 1726853781.85099: in VariableManager get_vars() 30583 1726853781.85144: Calling all_inventory to load vars for managed_node2 30583 1726853781.85146: Calling groups_inventory to load vars for managed_node2 30583 1726853781.85154: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853781.85164: Calling all_plugins_play to load vars for managed_node2 30583 1726853781.85166: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853781.85169: Calling groups_plugins_play to load vars for managed_node2 30583 1726853781.85183: done sending task result for task 02083763-bbaf-05ea-abc5-000000002330 30583 1726853781.85185: WORKER PROCESS EXITING 30583 1726853781.86145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853781.87006: done with get_vars() 30583 1726853781.87023: done getting variables 30583 1726853781.87068: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024 13:36:21 -0400 (0:00:00.093) 0:01:57.208 ******

30583 1726853781.87096: entering _queue_task() for managed_node2/package
30583 1726853781.87352: worker is 1 (out of 1 available)
30583 1726853781.87366: exiting _queue_task() for managed_node2/package
30583 1726853781.87380: done queuing things up, now waiting for results queue to drain
30583 1726853781.87382: waiting for pending results...
30583 1726853781.87581: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
30583 1726853781.87700: in run() - task 02083763-bbaf-05ea-abc5-000000002331
30583 1726853781.87714: variable 'ansible_search_path' from source: unknown
30583 1726853781.87717: variable 'ansible_search_path' from source: unknown
30583 1726853781.87746: calling self._execute()
30583 1726853781.87823: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853781.87827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853781.87838: variable 'omit' from source: magic vars
30583 1726853781.88118: variable 'ansible_distribution_major_version' from source: facts
30583 1726853781.88127: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853781.88216: variable 'network_state' from source: role '' defaults
30583 1726853781.88226: Evaluated conditional (network_state != {}): False
30583 1726853781.88229: when evaluation
is False, skipping this task
30583 1726853781.88232: _execute() done
30583 1726853781.88236: dumping result to json
30583 1726853781.88238: done dumping result, returning
30583 1726853781.88245: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-05ea-abc5-000000002331]
30583 1726853781.88250: sending task result for task 02083763-bbaf-05ea-abc5-000000002331
30583 1726853781.88343: done sending task result for task 02083763-bbaf-05ea-abc5-000000002331
30583 1726853781.88346: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30583 1726853781.88407: no more pending results, returning what we have
30583 1726853781.88411: results queue empty
30583 1726853781.88412: checking for any_errors_fatal
30583 1726853781.88419: done checking for any_errors_fatal
30583 1726853781.88420: checking for max_fail_percentage
30583 1726853781.88422: done checking for max_fail_percentage
30583 1726853781.88423: checking to see if all hosts have failed and the running result is not ok
30583 1726853781.88424: done checking to see if all hosts have failed
30583 1726853781.88424: getting the remaining hosts for this loop
30583 1726853781.88426: done getting the remaining hosts for this loop
30583 1726853781.88430: getting the next task for host managed_node2
30583 1726853781.88437: done getting next task for host managed_node2
30583 1726853781.88441: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
30583 1726853781.88446: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853781.88473: getting variables 30583 1726853781.88475: in VariableManager get_vars() 30583 1726853781.88514: Calling all_inventory to load vars for managed_node2 30583 1726853781.88516: Calling groups_inventory to load vars for managed_node2 30583 1726853781.88518: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853781.88527: Calling all_plugins_play to load vars for managed_node2 30583 1726853781.88530: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853781.88532: Calling groups_plugins_play to load vars for managed_node2 30583 1726853781.89324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853781.90333: done with get_vars() 30583 1726853781.90349: done getting variables 30583 1726853781.90396: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024 13:36:21 -0400 (0:00:00.033) 0:01:57.241 ******

30583 1726853781.90424: entering _queue_task() for managed_node2/package
30583 1726853781.90690: worker is 1 (out of 1 available)
30583 1726853781.90703: exiting _queue_task() for managed_node2/package
30583 1726853781.90716: done queuing things up, now waiting for results queue to drain
30583 1726853781.90717: waiting for pending results...
30583 1726853781.90916: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
30583 1726853781.91020: in run() - task 02083763-bbaf-05ea-abc5-000000002332
30583 1726853781.91031: variable 'ansible_search_path' from source: unknown
30583 1726853781.91035: variable 'ansible_search_path' from source: unknown
30583 1726853781.91066: calling self._execute()
30583 1726853781.91145: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853781.91148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853781.91162: variable 'omit' from source: magic vars
30583 1726853781.91440: variable 'ansible_distribution_major_version' from source: facts
30583 1726853781.91448: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853781.91538: variable 'network_state' from source: role '' defaults
30583 1726853781.91547: Evaluated conditional (network_state != {}): False
30583 1726853781.91550: when evaluation is False, skipping this task
30583 1726853781.91553: _execute() done
30583 1726853781.91556: dumping
result to json
30583 1726853781.91561: done dumping result, returning
30583 1726853781.91567: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-05ea-abc5-000000002332]
30583 1726853781.91573: sending task result for task 02083763-bbaf-05ea-abc5-000000002332
30583 1726853781.91665: done sending task result for task 02083763-bbaf-05ea-abc5-000000002332
30583 1726853781.91668: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30583 1726853781.91742: no more pending results, returning what we have
30583 1726853781.91746: results queue empty
30583 1726853781.91747: checking for any_errors_fatal
30583 1726853781.91763: done checking for any_errors_fatal
30583 1726853781.91763: checking for max_fail_percentage
30583 1726853781.91765: done checking for max_fail_percentage
30583 1726853781.91766: checking to see if all hosts have failed and the running result is not ok
30583 1726853781.91767: done checking to see if all hosts have failed
30583 1726853781.91768: getting the remaining hosts for this loop
30583 1726853781.91770: done getting the remaining hosts for this loop
30583 1726853781.91776: getting the next task for host managed_node2
30583 1726853781.91785: done getting next task for host managed_node2
30583 1726853781.91789: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
30583 1726853781.91793: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853781.91813: getting variables 30583 1726853781.91815: in VariableManager get_vars() 30583 1726853781.91853: Calling all_inventory to load vars for managed_node2 30583 1726853781.91855: Calling groups_inventory to load vars for managed_node2 30583 1726853781.91860: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853781.91868: Calling all_plugins_play to load vars for managed_node2 30583 1726853781.91875: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853781.91878: Calling groups_plugins_play to load vars for managed_node2 30583 1726853781.92661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853781.93547: done with get_vars() 30583 1726853781.93572: done getting variables 30583 1726853781.93619: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or 
team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024 13:36:21 -0400 (0:00:00.032) 0:01:57.273 ******

30583 1726853781.93646: entering _queue_task() for managed_node2/service
30583 1726853781.93915: worker is 1 (out of 1 available)
30583 1726853781.93929: exiting _queue_task() for managed_node2/service
30583 1726853781.93942: done queuing things up, now waiting for results queue to drain
30583 1726853781.93943: waiting for pending results...
30583 1726853781.94143: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
30583 1726853781.94252: in run() - task 02083763-bbaf-05ea-abc5-000000002333
30583 1726853781.94262: variable 'ansible_search_path' from source: unknown
30583 1726853781.94274: variable 'ansible_search_path' from source: unknown
30583 1726853781.94305: calling self._execute()
30583 1726853781.94379: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853781.94384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853781.94475: variable 'omit' from source: magic vars
30583 1726853781.94678: variable 'ansible_distribution_major_version' from source: facts
30583 1726853781.94687: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853781.94775: variable '__network_wireless_connections_defined' from source: role '' defaults
30583 1726853781.94910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30583 1726853781.96411: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30583 1726853781.96466: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30583 1726853781.96496: Loading FilterModule 'mathstuff' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853781.96521: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853781.96541: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853781.96604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853781.96625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853781.96644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853781.96675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853781.96685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853781.96716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853781.96732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853781.96748: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853781.96779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853781.96789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853781.96815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853781.96831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853781.96847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853781.96874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853781.96885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853781.97002: variable 'network_connections' from source: include params 30583 1726853781.97011: variable 'interface' from source: play vars 30583 1726853781.97062: variable 'interface' from source: play vars 30583 1726853781.97114: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853781.97224: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853781.97546: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853781.97569: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853781.97591: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853781.97623: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853781.97639: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853781.97662: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853781.97680: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853781.97720: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853781.97877: variable 'network_connections' from source: include params 30583 1726853781.97881: variable 'interface' from source: play vars 30583 1726853781.97925: variable 'interface' from source: play vars 30583 1726853781.97943: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853781.97946: when evaluation is False, skipping this task 30583 
1726853781.97949: _execute() done
30583 1726853781.97951: dumping result to json
30583 1726853781.97953: done dumping result, returning
30583 1726853781.97963: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000002333]
30583 1726853781.97965: sending task result for task 02083763-bbaf-05ea-abc5-000000002333
30583 1726853781.98054: done sending task result for task 02083763-bbaf-05ea-abc5-000000002333
30583 1726853781.98067: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
30583 1726853781.98127: no more pending results, returning what we have
30583 1726853781.98130: results queue empty
30583 1726853781.98131: checking for any_errors_fatal
30583 1726853781.98137: done checking for any_errors_fatal
30583 1726853781.98138: checking for max_fail_percentage
30583 1726853781.98140: done checking for max_fail_percentage
30583 1726853781.98141: checking to see if all hosts have failed and the running result is not ok
30583 1726853781.98142: done checking to see if all hosts have failed
30583 1726853781.98143: getting the remaining hosts for this loop
30583 1726853781.98145: done getting the remaining hosts for this loop
30583 1726853781.98148: getting the next task for host managed_node2
30583 1726853781.98160: done getting next task for host managed_node2
30583 1726853781.98165: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
30583 1726853781.98170: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853781.98198: getting variables 30583 1726853781.98199: in VariableManager get_vars() 30583 1726853781.98245: Calling all_inventory to load vars for managed_node2 30583 1726853781.98248: Calling groups_inventory to load vars for managed_node2 30583 1726853781.98251: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853781.98261: Calling all_plugins_play to load vars for managed_node2 30583 1726853781.98264: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853781.98267: Calling groups_plugins_play to load vars for managed_node2 30583 1726853781.99265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853782.00123: done with get_vars() 30583 1726853782.00144: done getting variables 30583 1726853782.00192: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024 13:36:22 -0400 (0:00:00.065) 0:01:57.339 ******

30583 1726853782.00219: entering _queue_task() for managed_node2/service
30583 1726853782.00489: worker is 1 (out of 1 available)
30583 1726853782.00505: exiting _queue_task() for managed_node2/service
30583 1726853782.00519: done queuing things up, now waiting for results queue to drain
30583 1726853782.00520: waiting for pending results...
30583 1726853782.00720: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
30583 1726853782.00834: in run() - task 02083763-bbaf-05ea-abc5-000000002334
30583 1726853782.00844: variable 'ansible_search_path' from source: unknown
30583 1726853782.00848: variable 'ansible_search_path' from source: unknown
30583 1726853782.00883: calling self._execute()
30583 1726853782.00954: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853782.00960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853782.00969: variable 'omit' from source: magic vars
30583 1726853782.01252: variable 'ansible_distribution_major_version' from source: facts
30583 1726853782.01264: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853782.01382: variable 'network_provider' from source: set_fact
30583 1726853782.01386: variable 'network_state' from source: role '' defaults
30583 1726853782.01396: Evaluated conditional (network_provider == "nm" or network_state != {}): True
30583 1726853782.01402: variable 'omit' from source: magic vars
30583 1726853782.01439: variable
'omit' from source: magic vars 30583 1726853782.01459: variable 'network_service_name' from source: role '' defaults 30583 1726853782.01512: variable 'network_service_name' from source: role '' defaults 30583 1726853782.01588: variable '__network_provider_setup' from source: role '' defaults 30583 1726853782.01591: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853782.01638: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853782.01645: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853782.01692: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853782.01838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853782.03308: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853782.03364: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853782.03390: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853782.03416: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853782.03436: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853782.03497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853782.03518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853782.03535: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853782.03562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853782.03573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853782.03606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853782.03622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853782.03638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853782.03663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853782.03674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853782.03826: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853782.03905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853782.03922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853782.03938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853782.04023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853782.04027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853782.04030: variable 'ansible_python' from source: facts 30583 1726853782.04044: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853782.04100: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853782.04154: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853782.04241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853782.04255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853782.04274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853782.04298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853782.04308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853782.04340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853782.04363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853782.04380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853782.04404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853782.04414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853782.04505: variable 'network_connections' from source: include params 30583 1726853782.04511: variable 'interface' from source: play vars 30583 1726853782.04563: variable 'interface' from source: play vars 30583 1726853782.04632: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853782.04764: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853782.04802: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853782.04832: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853782.04863: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853782.04908: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853782.04928: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853782.04950: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853782.04975: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853782.05075: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853782.05199: variable 'network_connections' from source: include params 30583 1726853782.05203: variable 'interface' from source: play vars 30583 1726853782.05256: variable 'interface' from source: play vars 30583 1726853782.05284: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853782.05339: variable '__network_wireless_connections_defined' from source: role '' defaults 
30583 1726853782.05522: variable 'network_connections' from source: include params 30583 1726853782.05525: variable 'interface' from source: play vars 30583 1726853782.05579: variable 'interface' from source: play vars 30583 1726853782.05594: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853782.05646: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853782.05829: variable 'network_connections' from source: include params 30583 1726853782.05833: variable 'interface' from source: play vars 30583 1726853782.05887: variable 'interface' from source: play vars 30583 1726853782.05919: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853782.05962: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853782.05966: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853782.06010: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853782.06141: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853782.06452: variable 'network_connections' from source: include params 30583 1726853782.06456: variable 'interface' from source: play vars 30583 1726853782.06500: variable 'interface' from source: play vars 30583 1726853782.06506: variable 'ansible_distribution' from source: facts 30583 1726853782.06509: variable '__network_rh_distros' from source: role '' defaults 30583 1726853782.06515: variable 'ansible_distribution_major_version' from source: facts 30583 1726853782.06525: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853782.06639: variable 'ansible_distribution' from source: facts 30583 1726853782.06643: variable '__network_rh_distros' from source: role '' defaults 30583 1726853782.06645: variable 'ansible_distribution_major_version' from 
source: facts 30583 1726853782.06661: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853782.06766: variable 'ansible_distribution' from source: facts 30583 1726853782.06769: variable '__network_rh_distros' from source: role '' defaults 30583 1726853782.06773: variable 'ansible_distribution_major_version' from source: facts 30583 1726853782.06798: variable 'network_provider' from source: set_fact 30583 1726853782.06816: variable 'omit' from source: magic vars 30583 1726853782.06837: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853782.06857: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853782.06877: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853782.06891: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853782.06899: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853782.06922: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853782.06925: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853782.06928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853782.07010: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853782.07014: Set connection var ansible_timeout to 10 30583 1726853782.07017: Set connection var ansible_connection to ssh 30583 1726853782.07023: Set connection var ansible_shell_executable to /bin/sh 30583 1726853782.07025: Set connection var ansible_shell_type to sh 30583 1726853782.07033: Set connection var ansible_pipelining to False 30583 1726853782.07054: variable 'ansible_shell_executable' from 
source: unknown 30583 1726853782.07057: variable 'ansible_connection' from source: unknown 30583 1726853782.07059: variable 'ansible_module_compression' from source: unknown 30583 1726853782.07063: variable 'ansible_shell_type' from source: unknown 30583 1726853782.07066: variable 'ansible_shell_executable' from source: unknown 30583 1726853782.07068: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853782.07072: variable 'ansible_pipelining' from source: unknown 30583 1726853782.07086: variable 'ansible_timeout' from source: unknown 30583 1726853782.07089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853782.07152: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853782.07160: variable 'omit' from source: magic vars 30583 1726853782.07168: starting attempt loop 30583 1726853782.07173: running the handler 30583 1726853782.07228: variable 'ansible_facts' from source: unknown 30583 1726853782.07714: _low_level_execute_command(): starting 30583 1726853782.07718: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853782.08214: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853782.08218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853782.08221: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853782.08223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853782.08276: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853782.08280: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853782.08282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853782.08363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853782.10110: stdout chunk (state=3): >>>/root <<< 30583 1726853782.10204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853782.10233: stderr chunk (state=3): >>><<< 30583 1726853782.10239: stdout chunk (state=3): >>><<< 30583 1726853782.10262: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853782.10269: _low_level_execute_command(): starting 30583 1726853782.10277: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853782.102589-35864-123641713889515 `" && echo ansible-tmp-1726853782.102589-35864-123641713889515="` echo /root/.ansible/tmp/ansible-tmp-1726853782.102589-35864-123641713889515 `" ) && sleep 0' 30583 1726853782.10723: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853782.10726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853782.10729: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853782.10731: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853782.10732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: 
match found <<< 30583 1726853782.10734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853782.10777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853782.10798: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853782.10801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853782.10865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853782.12858: stdout chunk (state=3): >>>ansible-tmp-1726853782.102589-35864-123641713889515=/root/.ansible/tmp/ansible-tmp-1726853782.102589-35864-123641713889515 <<< 30583 1726853782.12965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853782.12994: stderr chunk (state=3): >>><<< 30583 1726853782.12997: stdout chunk (state=3): >>><<< 30583 1726853782.13010: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853782.102589-35864-123641713889515=/root/.ansible/tmp/ansible-tmp-1726853782.102589-35864-123641713889515 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853782.13040: variable 'ansible_module_compression' from source: unknown 30583 1726853782.13081: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30583 1726853782.13132: variable 'ansible_facts' from source: unknown 30583 1726853782.13268: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853782.102589-35864-123641713889515/AnsiballZ_systemd.py 30583 1726853782.13367: Sending initial data 30583 1726853782.13370: Sent initial data (155 bytes) 30583 1726853782.13813: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853782.13816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853782.13822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853782.13824: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853782.13826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853782.13877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853782.13880: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853782.13883: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853782.13960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853782.15637: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30583 1726853782.15640: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853782.15702: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853782.15775: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpbw6ybepm /root/.ansible/tmp/ansible-tmp-1726853782.102589-35864-123641713889515/AnsiballZ_systemd.py <<< 30583 1726853782.15778: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853782.102589-35864-123641713889515/AnsiballZ_systemd.py" <<< 30583 1726853782.15843: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpbw6ybepm" to remote "/root/.ansible/tmp/ansible-tmp-1726853782.102589-35864-123641713889515/AnsiballZ_systemd.py" <<< 30583 1726853782.15846: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853782.102589-35864-123641713889515/AnsiballZ_systemd.py" <<< 30583 1726853782.17032: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853782.17073: stderr chunk (state=3): >>><<< 30583 1726853782.17076: stdout chunk (state=3): >>><<< 30583 1726853782.17111: done transferring module to remote 30583 1726853782.17120: _low_level_execute_command(): starting 30583 1726853782.17124: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853782.102589-35864-123641713889515/ /root/.ansible/tmp/ansible-tmp-1726853782.102589-35864-123641713889515/AnsiballZ_systemd.py && sleep 0' 30583 1726853782.17556: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853782.17559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853782.17561: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853782.17564: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853782.17566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853782.17567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853782.17613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853782.17625: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853782.17696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853782.19562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853782.19585: stderr chunk (state=3): >>><<< 30583 1726853782.19588: stdout chunk (state=3): >>><<< 30583 1726853782.19599: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853782.19602: _low_level_execute_command(): starting 30583 1726853782.19607: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853782.102589-35864-123641713889515/AnsiballZ_systemd.py && sleep 0' 30583 1726853782.20022: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853782.20025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853782.20028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853782.20030: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853782.20032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 30583 1726853782.20092: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853782.20095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853782.20164: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853782.50327: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ 
path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4661248", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3307708416", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2029275000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", 
"MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredum<<< 30583 1726853782.50343: stdout chunk (state=3): >>>pReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", 
"SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "syst<<< 30583 1726853782.50351: stdout chunk (state=3): >>>em.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", 
"StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30583 1726853782.52336: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853782.52370: stderr chunk (state=3): >>><<< 30583 1726853782.52375: stdout chunk (state=3): >>><<< 30583 1726853782.52392: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4661248", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3307708416", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2029275000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", 
"ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853782.52516: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853782.102589-35864-123641713889515/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853782.52531: _low_level_execute_command(): starting 30583 1726853782.52536: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853782.102589-35864-123641713889515/ > /dev/null 2>&1 && sleep 0' 30583 1726853782.52981: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853782.52985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853782.52987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853782.52991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853782.52993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853782.53044: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853782.53047: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853782.53052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853782.53121: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853782.55028: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853782.55052: stderr chunk (state=3): >>><<< 30583 1726853782.55056: stdout chunk (state=3): >>><<< 30583 1726853782.55067: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853782.55074: handler run complete 30583 1726853782.55111: attempt loop complete, returning result 30583 1726853782.55114: _execute() done 30583 1726853782.55116: dumping result to json 30583 1726853782.55128: done dumping result, returning 30583 1726853782.55136: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-05ea-abc5-000000002334] 30583 1726853782.55145: sending task result for task 02083763-bbaf-05ea-abc5-000000002334 30583 1726853782.55381: done sending task result for task 02083763-bbaf-05ea-abc5-000000002334 30583 1726853782.55384: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853782.55440: no more pending results, returning what we have 30583 1726853782.55444: results queue empty 30583 1726853782.55445: checking for any_errors_fatal 30583 1726853782.55450: done checking for any_errors_fatal 30583 1726853782.55451: checking for max_fail_percentage 30583 1726853782.55453: done checking for max_fail_percentage 30583 1726853782.55454: checking to see if all hosts have failed and the running result is not ok 30583 1726853782.55454: done checking to see if all hosts have failed 30583 1726853782.55455: getting the remaining hosts for this loop 30583 1726853782.55457: done getting the remaining hosts for this loop 30583 1726853782.55462: getting the next task for host managed_node2 30583 1726853782.55469: done getting next task for host managed_node2 30583 1726853782.55474: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30583 1726853782.55479: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853782.55491: getting variables 30583 1726853782.55492: in VariableManager get_vars() 30583 1726853782.55530: Calling all_inventory to load vars for managed_node2 30583 1726853782.55532: Calling groups_inventory to load vars for managed_node2 30583 1726853782.55534: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853782.55543: Calling all_plugins_play to load vars for managed_node2 30583 1726853782.55545: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853782.55548: Calling groups_plugins_play to load vars for managed_node2 30583 1726853782.56352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853782.57219: done with get_vars() 30583 1726853782.57235: done getting variables 30583 1726853782.57282: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:36:22 -0400 (0:00:00.570) 0:01:57.910 ****** 30583 1726853782.57312: entering _queue_task() for managed_node2/service 30583 1726853782.57548: worker is 1 (out of 1 available) 30583 1726853782.57564: exiting _queue_task() for managed_node2/service 30583 1726853782.57579: done queuing things up, now waiting for results queue to drain 30583 1726853782.57581: waiting for pending results... 
30583 1726853782.57770: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30583 1726853782.57864: in run() - task 02083763-bbaf-05ea-abc5-000000002335 30583 1726853782.57874: variable 'ansible_search_path' from source: unknown 30583 1726853782.57878: variable 'ansible_search_path' from source: unknown 30583 1726853782.57907: calling self._execute() 30583 1726853782.57988: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853782.57992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853782.58000: variable 'omit' from source: magic vars 30583 1726853782.58290: variable 'ansible_distribution_major_version' from source: facts 30583 1726853782.58298: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853782.58379: variable 'network_provider' from source: set_fact 30583 1726853782.58382: Evaluated conditional (network_provider == "nm"): True 30583 1726853782.58445: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853782.58508: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853782.58623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853782.60283: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853782.60328: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853782.60354: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853782.60382: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853782.60402: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853782.60464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853782.60485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853782.60503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853782.60530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853782.60541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853782.60575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853782.60591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853782.60608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853782.60634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853782.60645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853782.60675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853782.60691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853782.60706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853782.60730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853782.60742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853782.60837: variable 'network_connections' from source: include params 30583 1726853782.60848: variable 'interface' from source: play vars 30583 1726853782.60898: variable 'interface' from source: play vars 30583 1726853782.60961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853782.61064: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853782.61093: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853782.61114: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853782.61133: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853782.61166: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853782.61185: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853782.61203: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853782.61219: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853782.61260: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853782.61412: variable 'network_connections' from source: include params 30583 1726853782.61416: variable 'interface' from source: play vars 30583 1726853782.61461: variable 'interface' from source: play vars 30583 1726853782.61483: Evaluated conditional (__network_wpa_supplicant_required): False 30583 1726853782.61486: when evaluation is False, skipping this task 30583 1726853782.61489: _execute() done 30583 1726853782.61491: dumping result to json 30583 1726853782.61494: done dumping result, returning 30583 1726853782.61503: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-05ea-abc5-000000002335] 30583 
1726853782.61515: sending task result for task 02083763-bbaf-05ea-abc5-000000002335 30583 1726853782.61598: done sending task result for task 02083763-bbaf-05ea-abc5-000000002335 30583 1726853782.61601: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30583 1726853782.61648: no more pending results, returning what we have 30583 1726853782.61652: results queue empty 30583 1726853782.61653: checking for any_errors_fatal 30583 1726853782.61681: done checking for any_errors_fatal 30583 1726853782.61682: checking for max_fail_percentage 30583 1726853782.61684: done checking for max_fail_percentage 30583 1726853782.61685: checking to see if all hosts have failed and the running result is not ok 30583 1726853782.61685: done checking to see if all hosts have failed 30583 1726853782.61686: getting the remaining hosts for this loop 30583 1726853782.61688: done getting the remaining hosts for this loop 30583 1726853782.61692: getting the next task for host managed_node2 30583 1726853782.61699: done getting next task for host managed_node2 30583 1726853782.61703: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30583 1726853782.61707: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853782.61731: getting variables 30583 1726853782.61733: in VariableManager get_vars() 30583 1726853782.61782: Calling all_inventory to load vars for managed_node2 30583 1726853782.61785: Calling groups_inventory to load vars for managed_node2 30583 1726853782.61787: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853782.61796: Calling all_plugins_play to load vars for managed_node2 30583 1726853782.61799: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853782.61801: Calling groups_plugins_play to load vars for managed_node2 30583 1726853782.62717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853782.63580: done with get_vars() 30583 1726853782.63596: done getting variables 30583 1726853782.63639: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:36:22 -0400 (0:00:00.063) 0:01:57.973 
****** 30583 1726853782.63665: entering _queue_task() for managed_node2/service 30583 1726853782.63917: worker is 1 (out of 1 available) 30583 1726853782.63931: exiting _queue_task() for managed_node2/service 30583 1726853782.63944: done queuing things up, now waiting for results queue to drain 30583 1726853782.63945: waiting for pending results... 30583 1726853782.64140: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30583 1726853782.64233: in run() - task 02083763-bbaf-05ea-abc5-000000002336 30583 1726853782.64246: variable 'ansible_search_path' from source: unknown 30583 1726853782.64250: variable 'ansible_search_path' from source: unknown 30583 1726853782.64286: calling self._execute() 30583 1726853782.64360: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853782.64364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853782.64370: variable 'omit' from source: magic vars 30583 1726853782.64663: variable 'ansible_distribution_major_version' from source: facts 30583 1726853782.64672: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853782.64751: variable 'network_provider' from source: set_fact 30583 1726853782.64755: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853782.64760: when evaluation is False, skipping this task 30583 1726853782.64763: _execute() done 30583 1726853782.64766: dumping result to json 30583 1726853782.64768: done dumping result, returning 30583 1726853782.64776: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-05ea-abc5-000000002336] 30583 1726853782.64780: sending task result for task 02083763-bbaf-05ea-abc5-000000002336 30583 1726853782.64873: done sending task result for task 02083763-bbaf-05ea-abc5-000000002336 30583 1726853782.64876: WORKER PROCESS EXITING skipping: 
[managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853782.64921: no more pending results, returning what we have 30583 1726853782.64925: results queue empty 30583 1726853782.64926: checking for any_errors_fatal 30583 1726853782.64935: done checking for any_errors_fatal 30583 1726853782.64936: checking for max_fail_percentage 30583 1726853782.64938: done checking for max_fail_percentage 30583 1726853782.64939: checking to see if all hosts have failed and the running result is not ok 30583 1726853782.64940: done checking to see if all hosts have failed 30583 1726853782.64941: getting the remaining hosts for this loop 30583 1726853782.64942: done getting the remaining hosts for this loop 30583 1726853782.64946: getting the next task for host managed_node2 30583 1726853782.64954: done getting next task for host managed_node2 30583 1726853782.64957: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853782.64964: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853782.64991: getting variables 30583 1726853782.64993: in VariableManager get_vars() 30583 1726853782.65033: Calling all_inventory to load vars for managed_node2 30583 1726853782.65037: Calling groups_inventory to load vars for managed_node2 30583 1726853782.65039: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853782.65048: Calling all_plugins_play to load vars for managed_node2 30583 1726853782.65050: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853782.65052: Calling groups_plugins_play to load vars for managed_node2 30583 1726853782.65855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853782.66847: done with get_vars() 30583 1726853782.66866: done getting variables 30583 1726853782.66913: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:36:22 -0400 (0:00:00.032) 0:01:58.006 ****** 30583 1726853782.66941: entering _queue_task() for managed_node2/copy 30583 1726853782.67205: worker is 1 (out of 1 available) 30583 1726853782.67218: exiting _queue_task() for managed_node2/copy 30583 1726853782.67231: done queuing things up, now waiting for results queue to drain 30583 1726853782.67233: waiting for 
pending results... 30583 1726853782.67433: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853782.67515: in run() - task 02083763-bbaf-05ea-abc5-000000002337 30583 1726853782.67526: variable 'ansible_search_path' from source: unknown 30583 1726853782.67531: variable 'ansible_search_path' from source: unknown 30583 1726853782.67563: calling self._execute() 30583 1726853782.67641: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853782.67644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853782.67653: variable 'omit' from source: magic vars 30583 1726853782.67945: variable 'ansible_distribution_major_version' from source: facts 30583 1726853782.67953: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853782.68035: variable 'network_provider' from source: set_fact 30583 1726853782.68039: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853782.68041: when evaluation is False, skipping this task 30583 1726853782.68044: _execute() done 30583 1726853782.68046: dumping result to json 30583 1726853782.68049: done dumping result, returning 30583 1726853782.68061: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-05ea-abc5-000000002337] 30583 1726853782.68064: sending task result for task 02083763-bbaf-05ea-abc5-000000002337 30583 1726853782.68151: done sending task result for task 02083763-bbaf-05ea-abc5-000000002337 30583 1726853782.68153: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30583 1726853782.68203: no more pending results, returning what we have 30583 1726853782.68207: results queue empty 30583 
1726853782.68208: checking for any_errors_fatal 30583 1726853782.68214: done checking for any_errors_fatal 30583 1726853782.68215: checking for max_fail_percentage 30583 1726853782.68218: done checking for max_fail_percentage 30583 1726853782.68220: checking to see if all hosts have failed and the running result is not ok 30583 1726853782.68220: done checking to see if all hosts have failed 30583 1726853782.68221: getting the remaining hosts for this loop 30583 1726853782.68223: done getting the remaining hosts for this loop 30583 1726853782.68226: getting the next task for host managed_node2 30583 1726853782.68235: done getting next task for host managed_node2 30583 1726853782.68239: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853782.68244: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853782.68270: getting variables 30583 1726853782.68273: in VariableManager get_vars() 30583 1726853782.68312: Calling all_inventory to load vars for managed_node2 30583 1726853782.68315: Calling groups_inventory to load vars for managed_node2 30583 1726853782.68317: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853782.68327: Calling all_plugins_play to load vars for managed_node2 30583 1726853782.68330: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853782.68333: Calling groups_plugins_play to load vars for managed_node2 30583 1726853782.69154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853782.70674: done with get_vars() 30583 1726853782.70694: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:36:22 -0400 (0:00:00.038) 0:01:58.044 ****** 30583 1726853782.70754: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853782.70996: worker is 1 (out of 1 available) 30583 1726853782.71009: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853782.71023: done queuing things up, now waiting for results queue to drain 30583 1726853782.71024: waiting for pending results... 
30583 1726853782.71220: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853782.71319: in run() - task 02083763-bbaf-05ea-abc5-000000002338 30583 1726853782.71330: variable 'ansible_search_path' from source: unknown 30583 1726853782.71334: variable 'ansible_search_path' from source: unknown 30583 1726853782.71367: calling self._execute() 30583 1726853782.71446: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853782.71449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853782.71458: variable 'omit' from source: magic vars 30583 1726853782.71755: variable 'ansible_distribution_major_version' from source: facts 30583 1726853782.71767: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853782.71774: variable 'omit' from source: magic vars 30583 1726853782.71823: variable 'omit' from source: magic vars 30583 1726853782.71936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853782.73404: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853782.73456: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853782.73487: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853782.73513: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853782.73533: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853782.73595: variable 'network_provider' from source: set_fact 30583 1726853782.73692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853782.73711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853782.73728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853782.73755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853782.73767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853782.73822: variable 'omit' from source: magic vars 30583 1726853782.73897: variable 'omit' from source: magic vars 30583 1726853782.73966: variable 'network_connections' from source: include params 30583 1726853782.73977: variable 'interface' from source: play vars 30583 1726853782.74017: variable 'interface' from source: play vars 30583 1726853782.74123: variable 'omit' from source: magic vars 30583 1726853782.74130: variable '__lsr_ansible_managed' from source: task vars 30583 1726853782.74172: variable '__lsr_ansible_managed' from source: task vars 30583 1726853782.74578: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30583 1726853782.74708: Loaded config def from plugin (lookup/template) 30583 1726853782.74712: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30583 1726853782.74736: File lookup term: get_ansible_managed.j2 30583 1726853782.74739: variable 
'ansible_search_path' from source: unknown 30583 1726853782.74743: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30583 1726853782.74754: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30583 1726853782.74768: variable 'ansible_search_path' from source: unknown 30583 1726853782.77924: variable 'ansible_managed' from source: unknown 30583 1726853782.78011: variable 'omit' from source: magic vars 30583 1726853782.78032: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853782.78051: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853782.78065: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853782.78081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30583 1726853782.78089: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853782.78110: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853782.78113: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853782.78116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853782.78175: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853782.78187: Set connection var ansible_timeout to 10 30583 1726853782.78190: Set connection var ansible_connection to ssh 30583 1726853782.78192: Set connection var ansible_shell_executable to /bin/sh 30583 1726853782.78194: Set connection var ansible_shell_type to sh 30583 1726853782.78200: Set connection var ansible_pipelining to False 30583 1726853782.78218: variable 'ansible_shell_executable' from source: unknown 30583 1726853782.78220: variable 'ansible_connection' from source: unknown 30583 1726853782.78223: variable 'ansible_module_compression' from source: unknown 30583 1726853782.78225: variable 'ansible_shell_type' from source: unknown 30583 1726853782.78227: variable 'ansible_shell_executable' from source: unknown 30583 1726853782.78230: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853782.78232: variable 'ansible_pipelining' from source: unknown 30583 1726853782.78236: variable 'ansible_timeout' from source: unknown 30583 1726853782.78240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853782.78331: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853782.78343: variable 'omit' from 
source: magic vars 30583 1726853782.78346: starting attempt loop 30583 1726853782.78349: running the handler 30583 1726853782.78361: _low_level_execute_command(): starting 30583 1726853782.78364: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853782.78872: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853782.78876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853782.78878: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853782.78880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853782.78927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853782.78930: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853782.78932: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853782.79018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853782.80757: stdout chunk (state=3): >>>/root <<< 30583 1726853782.80855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 
1726853782.80887: stderr chunk (state=3): >>><<< 30583 1726853782.80890: stdout chunk (state=3): >>><<< 30583 1726853782.80910: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853782.80920: _low_level_execute_command(): starting 30583 1726853782.80926: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853782.8091047-35878-204444083798187 `" && echo ansible-tmp-1726853782.8091047-35878-204444083798187="` echo /root/.ansible/tmp/ansible-tmp-1726853782.8091047-35878-204444083798187 `" ) && sleep 0' 30583 1726853782.81350: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30583 1726853782.81353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853782.81356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853782.81358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853782.81361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853782.81409: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853782.81412: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853782.81493: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853782.83509: stdout chunk (state=3): >>>ansible-tmp-1726853782.8091047-35878-204444083798187=/root/.ansible/tmp/ansible-tmp-1726853782.8091047-35878-204444083798187 <<< 30583 1726853782.83619: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853782.83642: stderr chunk (state=3): >>><<< 30583 1726853782.83645: stdout chunk (state=3): >>><<< 30583 1726853782.83660: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853782.8091047-35878-204444083798187=/root/.ansible/tmp/ansible-tmp-1726853782.8091047-35878-204444083798187 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853782.83703: variable 'ansible_module_compression' from source: unknown 30583 1726853782.83742: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30583 1726853782.83786: variable 'ansible_facts' from source: unknown 30583 1726853782.83878: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853782.8091047-35878-204444083798187/AnsiballZ_network_connections.py 30583 1726853782.83979: Sending initial data 30583 1726853782.83983: Sent initial data (168 bytes) 30583 1726853782.84425: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853782.84429: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853782.84435: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853782.84437: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853782.84439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853782.84441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853782.84489: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853782.84492: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853782.84566: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853782.86190: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" 
revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30583 1726853782.86195: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853782.86259: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853782.86333: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpfhv564bk /root/.ansible/tmp/ansible-tmp-1726853782.8091047-35878-204444083798187/AnsiballZ_network_connections.py <<< 30583 1726853782.86335: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853782.8091047-35878-204444083798187/AnsiballZ_network_connections.py" <<< 30583 1726853782.86400: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpfhv564bk" to remote "/root/.ansible/tmp/ansible-tmp-1726853782.8091047-35878-204444083798187/AnsiballZ_network_connections.py" <<< 30583 1726853782.86403: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853782.8091047-35878-204444083798187/AnsiballZ_network_connections.py" <<< 30583 1726853782.87241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853782.87287: stderr chunk (state=3): >>><<< 30583 1726853782.87290: stdout chunk (state=3): >>><<< 30583 1726853782.87324: done transferring module to remote 30583 1726853782.87333: _low_level_execute_command(): starting 30583 1726853782.87337: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853782.8091047-35878-204444083798187/ /root/.ansible/tmp/ansible-tmp-1726853782.8091047-35878-204444083798187/AnsiballZ_network_connections.py && sleep 0' 30583 1726853782.87784: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config <<< 30583 1726853782.87787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853782.87790: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853782.87792: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853782.87794: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853782.87847: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853782.87853: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853782.87856: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853782.87922: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853782.89801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853782.89827: stderr chunk (state=3): >>><<< 30583 1726853782.89830: stdout chunk (state=3): >>><<< 30583 1726853782.89844: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853782.89846: _low_level_execute_command(): starting 30583 1726853782.89854: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853782.8091047-35878-204444083798187/AnsiballZ_network_connections.py && sleep 0' 30583 1726853782.90293: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853782.90296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853782.90298: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config <<< 30583 1726853782.90301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853782.90307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853782.90354: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853782.90361: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853782.90364: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853782.90438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853783.16173: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 3512d7ba-d156-408a-9044-dcd593676efd skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30583 1726853783.18139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853783.18164: stderr chunk (state=3): >>><<< 30583 1726853783.18167: stdout chunk (state=3): >>><<< 30583 1726853783.18187: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 3512d7ba-d156-408a-9044-dcd593676efd skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853783.18215: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853782.8091047-35878-204444083798187/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853783.18224: _low_level_execute_command(): starting 30583 1726853783.18227: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853782.8091047-35878-204444083798187/ > /dev/null 2>&1 && sleep 0' 30583 1726853783.18684: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853783.18688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853783.18690: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853783.18692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853783.18742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853783.18745: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853783.18747: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853783.18827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853783.20755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853783.20786: stderr chunk (state=3): >>><<< 30583 1726853783.20789: stdout chunk (state=3): >>><<< 30583 1726853783.20802: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853783.20808: handler run complete 30583 1726853783.20829: attempt loop complete, returning result 30583 1726853783.20832: _execute() done 30583 1726853783.20835: dumping result to json 30583 1726853783.20838: done dumping result, returning 30583 1726853783.20847: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-05ea-abc5-000000002338] 30583 1726853783.20854: sending task result for task 02083763-bbaf-05ea-abc5-000000002338 30583 1726853783.20951: done sending task result for task 02083763-bbaf-05ea-abc5-000000002338 30583 1726853783.20955: WORKER PROCESS EXITING ok: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 3512d7ba-d156-408a-9044-dcd593676efd skipped because already active 30583 1726853783.21072: no more pending results, returning what we have 30583 1726853783.21076: results queue empty 30583 1726853783.21077: checking for any_errors_fatal 30583 1726853783.21083: done checking for any_errors_fatal 30583 1726853783.21084: checking for max_fail_percentage 30583 1726853783.21085: done checking for max_fail_percentage 30583 1726853783.21086: checking to see if all hosts have failed and the running result is not ok 30583 1726853783.21087: done checking to see if all hosts have failed 30583 1726853783.21088: getting the remaining hosts 
for this loop 30583 1726853783.21089: done getting the remaining hosts for this loop 30583 1726853783.21093: getting the next task for host managed_node2 30583 1726853783.21100: done getting next task for host managed_node2 30583 1726853783.21104: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853783.21109: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853783.21121: getting variables 30583 1726853783.21122: in VariableManager get_vars() 30583 1726853783.21166: Calling all_inventory to load vars for managed_node2 30583 1726853783.21169: Calling groups_inventory to load vars for managed_node2 30583 1726853783.21176: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853783.21185: Calling all_plugins_play to load vars for managed_node2 30583 1726853783.21188: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853783.21190: Calling groups_plugins_play to load vars for managed_node2 30583 1726853783.22176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853783.23027: done with get_vars() 30583 1726853783.23045: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:36:23 -0400 (0:00:00.523) 0:01:58.568 ****** 30583 1726853783.23111: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30583 1726853783.23369: worker is 1 (out of 1 available) 30583 1726853783.23386: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30583 1726853783.23400: done queuing things up, now waiting for results queue to drain 30583 1726853783.23401: waiting for pending results... 
30583 1726853783.23593: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853783.23688: in run() - task 02083763-bbaf-05ea-abc5-000000002339 30583 1726853783.23701: variable 'ansible_search_path' from source: unknown 30583 1726853783.23704: variable 'ansible_search_path' from source: unknown 30583 1726853783.23739: calling self._execute() 30583 1726853783.23811: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853783.23815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853783.23823: variable 'omit' from source: magic vars 30583 1726853783.24112: variable 'ansible_distribution_major_version' from source: facts 30583 1726853783.24122: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853783.24207: variable 'network_state' from source: role '' defaults 30583 1726853783.24214: Evaluated conditional (network_state != {}): False 30583 1726853783.24217: when evaluation is False, skipping this task 30583 1726853783.24220: _execute() done 30583 1726853783.24222: dumping result to json 30583 1726853783.24224: done dumping result, returning 30583 1726853783.24232: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-05ea-abc5-000000002339] 30583 1726853783.24238: sending task result for task 02083763-bbaf-05ea-abc5-000000002339 30583 1726853783.24326: done sending task result for task 02083763-bbaf-05ea-abc5-000000002339 30583 1726853783.24328: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853783.24386: no more pending results, returning what we have 30583 1726853783.24390: results queue empty 30583 1726853783.24391: checking for any_errors_fatal 30583 1726853783.24404: done checking for any_errors_fatal 
30583 1726853783.24405: checking for max_fail_percentage 30583 1726853783.24407: done checking for max_fail_percentage 30583 1726853783.24408: checking to see if all hosts have failed and the running result is not ok 30583 1726853783.24409: done checking to see if all hosts have failed 30583 1726853783.24409: getting the remaining hosts for this loop 30583 1726853783.24412: done getting the remaining hosts for this loop 30583 1726853783.24415: getting the next task for host managed_node2 30583 1726853783.24423: done getting next task for host managed_node2 30583 1726853783.24426: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30583 1726853783.24431: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853783.24456: getting variables 30583 1726853783.24460: in VariableManager get_vars() 30583 1726853783.24504: Calling all_inventory to load vars for managed_node2 30583 1726853783.24507: Calling groups_inventory to load vars for managed_node2 30583 1726853783.24509: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853783.24518: Calling all_plugins_play to load vars for managed_node2 30583 1726853783.24520: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853783.24522: Calling groups_plugins_play to load vars for managed_node2 30583 1726853783.25333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853783.26199: done with get_vars() 30583 1726853783.26219: done getting variables 30583 1726853783.26266: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:36:23 -0400 (0:00:00.031) 0:01:58.600 ****** 30583 1726853783.26293: entering _queue_task() for managed_node2/debug 30583 1726853783.26561: worker is 1 (out of 1 available) 30583 1726853783.26578: exiting _queue_task() for managed_node2/debug 30583 1726853783.26590: done queuing things up, now waiting for results queue to drain 30583 1726853783.26592: waiting for pending results... 
30583 1726853783.26787: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30583 1726853783.26870: in run() - task 02083763-bbaf-05ea-abc5-00000000233a 30583 1726853783.26884: variable 'ansible_search_path' from source: unknown 30583 1726853783.26888: variable 'ansible_search_path' from source: unknown 30583 1726853783.26918: calling self._execute() 30583 1726853783.27005: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853783.27008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853783.27017: variable 'omit' from source: magic vars 30583 1726853783.27307: variable 'ansible_distribution_major_version' from source: facts 30583 1726853783.27316: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853783.27322: variable 'omit' from source: magic vars 30583 1726853783.27380: variable 'omit' from source: magic vars 30583 1726853783.27405: variable 'omit' from source: magic vars 30583 1726853783.27440: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853783.27469: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853783.27489: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853783.27503: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853783.27512: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853783.27537: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853783.27540: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853783.27542: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 30583 1726853783.27617: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853783.27621: Set connection var ansible_timeout to 10 30583 1726853783.27624: Set connection var ansible_connection to ssh 30583 1726853783.27629: Set connection var ansible_shell_executable to /bin/sh 30583 1726853783.27632: Set connection var ansible_shell_type to sh 30583 1726853783.27639: Set connection var ansible_pipelining to False 30583 1726853783.27660: variable 'ansible_shell_executable' from source: unknown 30583 1726853783.27663: variable 'ansible_connection' from source: unknown 30583 1726853783.27666: variable 'ansible_module_compression' from source: unknown 30583 1726853783.27668: variable 'ansible_shell_type' from source: unknown 30583 1726853783.27672: variable 'ansible_shell_executable' from source: unknown 30583 1726853783.27674: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853783.27677: variable 'ansible_pipelining' from source: unknown 30583 1726853783.27679: variable 'ansible_timeout' from source: unknown 30583 1726853783.27681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853783.27783: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853783.27792: variable 'omit' from source: magic vars 30583 1726853783.27799: starting attempt loop 30583 1726853783.27802: running the handler 30583 1726853783.27902: variable '__network_connections_result' from source: set_fact 30583 1726853783.27946: handler run complete 30583 1726853783.27962: attempt loop complete, returning result 30583 1726853783.27965: _execute() done 30583 1726853783.27967: dumping result to json 30583 1726853783.27969: 
done dumping result, returning 30583 1726853783.27978: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-05ea-abc5-00000000233a] 30583 1726853783.27982: sending task result for task 02083763-bbaf-05ea-abc5-00000000233a 30583 1726853783.28066: done sending task result for task 02083763-bbaf-05ea-abc5-00000000233a 30583 1726853783.28069: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 3512d7ba-d156-408a-9044-dcd593676efd skipped because already active" ] } 30583 1726853783.28145: no more pending results, returning what we have 30583 1726853783.28148: results queue empty 30583 1726853783.28149: checking for any_errors_fatal 30583 1726853783.28155: done checking for any_errors_fatal 30583 1726853783.28155: checking for max_fail_percentage 30583 1726853783.28160: done checking for max_fail_percentage 30583 1726853783.28161: checking to see if all hosts have failed and the running result is not ok 30583 1726853783.28162: done checking to see if all hosts have failed 30583 1726853783.28162: getting the remaining hosts for this loop 30583 1726853783.28164: done getting the remaining hosts for this loop 30583 1726853783.28168: getting the next task for host managed_node2 30583 1726853783.28177: done getting next task for host managed_node2 30583 1726853783.28182: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30583 1726853783.28187: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853783.28199: getting variables 30583 1726853783.28201: in VariableManager get_vars() 30583 1726853783.28244: Calling all_inventory to load vars for managed_node2 30583 1726853783.28247: Calling groups_inventory to load vars for managed_node2 30583 1726853783.28249: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853783.28260: Calling all_plugins_play to load vars for managed_node2 30583 1726853783.28263: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853783.28265: Calling groups_plugins_play to load vars for managed_node2 30583 1726853783.29204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853783.30069: done with get_vars() 30583 1726853783.30091: done getting variables 30583 1726853783.30135: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:36:23 -0400 (0:00:00.038) 0:01:58.638 ****** 30583 1726853783.30168: entering _queue_task() for managed_node2/debug 30583 1726853783.30434: worker is 1 (out of 1 available) 30583 1726853783.30448: exiting _queue_task() for managed_node2/debug 30583 1726853783.30464: done queuing things up, now waiting for results queue to drain 30583 1726853783.30466: waiting for pending results... 30583 1726853783.30661: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30583 1726853783.30756: in run() - task 02083763-bbaf-05ea-abc5-00000000233b 30583 1726853783.30769: variable 'ansible_search_path' from source: unknown 30583 1726853783.30773: variable 'ansible_search_path' from source: unknown 30583 1726853783.30806: calling self._execute() 30583 1726853783.30887: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853783.30890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853783.30899: variable 'omit' from source: magic vars 30583 1726853783.31184: variable 'ansible_distribution_major_version' from source: facts 30583 1726853783.31194: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853783.31200: variable 'omit' from source: magic vars 30583 1726853783.31248: variable 'omit' from source: magic vars 30583 1726853783.31275: variable 'omit' from source: magic vars 30583 1726853783.31310: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853783.31336: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853783.31354: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853783.31368: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853783.31380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853783.31404: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853783.31407: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853783.31411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853783.31487: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853783.31491: Set connection var ansible_timeout to 10 30583 1726853783.31495: Set connection var ansible_connection to ssh 30583 1726853783.31500: Set connection var ansible_shell_executable to /bin/sh 30583 1726853783.31502: Set connection var ansible_shell_type to sh 30583 1726853783.31509: Set connection var ansible_pipelining to False 30583 1726853783.31528: variable 'ansible_shell_executable' from source: unknown 30583 1726853783.31530: variable 'ansible_connection' from source: unknown 30583 1726853783.31533: variable 'ansible_module_compression' from source: unknown 30583 1726853783.31535: variable 'ansible_shell_type' from source: unknown 30583 1726853783.31538: variable 'ansible_shell_executable' from source: unknown 30583 1726853783.31540: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853783.31544: variable 'ansible_pipelining' from source: unknown 30583 1726853783.31546: variable 'ansible_timeout' from source: unknown 30583 1726853783.31550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853783.31652: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853783.31663: variable 'omit' from source: magic vars 30583 1726853783.31667: starting attempt loop 30583 1726853783.31670: running the handler 30583 1726853783.31711: variable '__network_connections_result' from source: set_fact 30583 1726853783.31767: variable '__network_connections_result' from source: set_fact 30583 1726853783.31847: handler run complete 30583 1726853783.31864: attempt loop complete, returning result 30583 1726853783.31867: _execute() done 30583 1726853783.31870: dumping result to json 30583 1726853783.31874: done dumping result, returning 30583 1726853783.31882: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-05ea-abc5-00000000233b] 30583 1726853783.31887: sending task result for task 02083763-bbaf-05ea-abc5-00000000233b 30583 1726853783.31982: done sending task result for task 02083763-bbaf-05ea-abc5-00000000233b 30583 1726853783.31985: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 3512d7ba-d156-408a-9044-dcd593676efd skipped because already active\n", "stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 3512d7ba-d156-408a-9044-dcd593676efd skipped because already active" ] } } 30583 1726853783.32089: no more pending results, returning what we have 30583 1726853783.32093: results queue empty 30583 
1726853783.32094: checking for any_errors_fatal 30583 1726853783.32101: done checking for any_errors_fatal 30583 1726853783.32102: checking for max_fail_percentage 30583 1726853783.32103: done checking for max_fail_percentage 30583 1726853783.32104: checking to see if all hosts have failed and the running result is not ok 30583 1726853783.32105: done checking to see if all hosts have failed 30583 1726853783.32106: getting the remaining hosts for this loop 30583 1726853783.32108: done getting the remaining hosts for this loop 30583 1726853783.32111: getting the next task for host managed_node2 30583 1726853783.32118: done getting next task for host managed_node2 30583 1726853783.32122: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30583 1726853783.32126: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853783.32137: getting variables 30583 1726853783.32139: in VariableManager get_vars() 30583 1726853783.32185: Calling all_inventory to load vars for managed_node2 30583 1726853783.32188: Calling groups_inventory to load vars for managed_node2 30583 1726853783.32196: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853783.32204: Calling all_plugins_play to load vars for managed_node2 30583 1726853783.32207: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853783.32209: Calling groups_plugins_play to load vars for managed_node2 30583 1726853783.33005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853783.33969: done with get_vars() 30583 1726853783.33988: done getting variables 30583 1726853783.34032: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:36:23 -0400 (0:00:00.038) 0:01:58.677 ****** 30583 1726853783.34056: entering _queue_task() for managed_node2/debug 30583 1726853783.34320: worker is 1 (out of 1 available) 30583 1726853783.34335: exiting _queue_task() for managed_node2/debug 30583 1726853783.34349: done queuing things up, now waiting for results queue to drain 30583 1726853783.34350: waiting for pending results... 
30583 1726853783.34543: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30583 1726853783.34644: in run() - task 02083763-bbaf-05ea-abc5-00000000233c 30583 1726853783.34655: variable 'ansible_search_path' from source: unknown 30583 1726853783.34661: variable 'ansible_search_path' from source: unknown 30583 1726853783.34692: calling self._execute() 30583 1726853783.34769: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853783.34774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853783.34783: variable 'omit' from source: magic vars 30583 1726853783.35068: variable 'ansible_distribution_major_version' from source: facts 30583 1726853783.35080: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853783.35164: variable 'network_state' from source: role '' defaults 30583 1726853783.35170: Evaluated conditional (network_state != {}): False 30583 1726853783.35175: when evaluation is False, skipping this task 30583 1726853783.35177: _execute() done 30583 1726853783.35180: dumping result to json 30583 1726853783.35182: done dumping result, returning 30583 1726853783.35191: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-05ea-abc5-00000000233c] 30583 1726853783.35195: sending task result for task 02083763-bbaf-05ea-abc5-00000000233c 30583 1726853783.35282: done sending task result for task 02083763-bbaf-05ea-abc5-00000000233c 30583 1726853783.35284: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 30583 1726853783.35331: no more pending results, returning what we have 30583 1726853783.35335: results queue empty 30583 1726853783.35336: checking for any_errors_fatal 30583 1726853783.35347: done checking for any_errors_fatal 30583 1726853783.35347: checking for 
max_fail_percentage 30583 1726853783.35349: done checking for max_fail_percentage 30583 1726853783.35350: checking to see if all hosts have failed and the running result is not ok 30583 1726853783.35351: done checking to see if all hosts have failed 30583 1726853783.35352: getting the remaining hosts for this loop 30583 1726853783.35354: done getting the remaining hosts for this loop 30583 1726853783.35359: getting the next task for host managed_node2 30583 1726853783.35368: done getting next task for host managed_node2 30583 1726853783.35374: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30583 1726853783.35378: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853783.35404: getting variables 30583 1726853783.35406: in VariableManager get_vars() 30583 1726853783.35447: Calling all_inventory to load vars for managed_node2 30583 1726853783.35450: Calling groups_inventory to load vars for managed_node2 30583 1726853783.35452: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853783.35464: Calling all_plugins_play to load vars for managed_node2 30583 1726853783.35466: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853783.35469: Calling groups_plugins_play to load vars for managed_node2 30583 1726853783.36249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853783.37109: done with get_vars() 30583 1726853783.37128: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:36:23 -0400 (0:00:00.031) 0:01:58.709 ****** 30583 1726853783.37204: entering _queue_task() for managed_node2/ping 30583 1726853783.37462: worker is 1 (out of 1 available) 30583 1726853783.37478: exiting _queue_task() for managed_node2/ping 30583 1726853783.37491: done queuing things up, now waiting for results queue to drain 30583 1726853783.37492: waiting for pending results... 
30583 1726853783.37687: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 30583 1726853783.37793: in run() - task 02083763-bbaf-05ea-abc5-00000000233d 30583 1726853783.37804: variable 'ansible_search_path' from source: unknown 30583 1726853783.37808: variable 'ansible_search_path' from source: unknown 30583 1726853783.37839: calling self._execute() 30583 1726853783.37917: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853783.37920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853783.37929: variable 'omit' from source: magic vars 30583 1726853783.38202: variable 'ansible_distribution_major_version' from source: facts 30583 1726853783.38212: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853783.38218: variable 'omit' from source: magic vars 30583 1726853783.38269: variable 'omit' from source: magic vars 30583 1726853783.38290: variable 'omit' from source: magic vars 30583 1726853783.38323: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853783.38349: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853783.38368: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853783.38384: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853783.38395: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853783.38419: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853783.38422: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853783.38426: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30583 1726853783.38499: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853783.38504: Set connection var ansible_timeout to 10 30583 1726853783.38507: Set connection var ansible_connection to ssh 30583 1726853783.38512: Set connection var ansible_shell_executable to /bin/sh 30583 1726853783.38515: Set connection var ansible_shell_type to sh 30583 1726853783.38523: Set connection var ansible_pipelining to False 30583 1726853783.38540: variable 'ansible_shell_executable' from source: unknown 30583 1726853783.38543: variable 'ansible_connection' from source: unknown 30583 1726853783.38546: variable 'ansible_module_compression' from source: unknown 30583 1726853783.38548: variable 'ansible_shell_type' from source: unknown 30583 1726853783.38550: variable 'ansible_shell_executable' from source: unknown 30583 1726853783.38552: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853783.38557: variable 'ansible_pipelining' from source: unknown 30583 1726853783.38562: variable 'ansible_timeout' from source: unknown 30583 1726853783.38564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853783.38710: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853783.38721: variable 'omit' from source: magic vars 30583 1726853783.38726: starting attempt loop 30583 1726853783.38728: running the handler 30583 1726853783.38739: _low_level_execute_command(): starting 30583 1726853783.38746: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853783.39259: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 
1726853783.39263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853783.39267: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853783.39270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853783.39322: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853783.39325: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853783.39328: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853783.39411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853783.41180: stdout chunk (state=3): >>>/root <<< 30583 1726853783.41275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853783.41306: stderr chunk (state=3): >>><<< 30583 1726853783.41309: stdout chunk (state=3): >>><<< 30583 1726853783.41330: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 
10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853783.41341: _low_level_execute_command(): starting 30583 1726853783.41348: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853783.4133017-35890-257340076982831 `" && echo ansible-tmp-1726853783.4133017-35890-257340076982831="` echo /root/.ansible/tmp/ansible-tmp-1726853783.4133017-35890-257340076982831 `" ) && sleep 0' 30583 1726853783.41794: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853783.41797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853783.41807: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853783.41809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853783.41811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853783.41857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853783.41867: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853783.41870: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853783.41933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853783.43942: stdout chunk (state=3): >>>ansible-tmp-1726853783.4133017-35890-257340076982831=/root/.ansible/tmp/ansible-tmp-1726853783.4133017-35890-257340076982831 <<< 30583 1726853783.44048: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853783.44079: stderr chunk (state=3): >>><<< 30583 1726853783.44082: stdout chunk (state=3): >>><<< 30583 1726853783.44098: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853783.4133017-35890-257340076982831=/root/.ansible/tmp/ansible-tmp-1726853783.4133017-35890-257340076982831 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853783.44140: variable 'ansible_module_compression' from source: unknown 30583 1726853783.44175: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30583 1726853783.44203: variable 'ansible_facts' from source: unknown 30583 1726853783.44262: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853783.4133017-35890-257340076982831/AnsiballZ_ping.py 30583 1726853783.44361: Sending initial data 30583 1726853783.44364: Sent initial data (153 bytes) 30583 1726853783.44807: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853783.44810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853783.44813: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853783.44815: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853783.44817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853783.44819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853783.44856: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853783.44868: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853783.44955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853783.46621: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853783.46694: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853783.46756: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpmdxr836p /root/.ansible/tmp/ansible-tmp-1726853783.4133017-35890-257340076982831/AnsiballZ_ping.py <<< 30583 1726853783.46763: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853783.4133017-35890-257340076982831/AnsiballZ_ping.py" <<< 30583 1726853783.46825: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpmdxr836p" to remote "/root/.ansible/tmp/ansible-tmp-1726853783.4133017-35890-257340076982831/AnsiballZ_ping.py" <<< 30583 1726853783.46828: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853783.4133017-35890-257340076982831/AnsiballZ_ping.py" <<< 30583 1726853783.47468: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853783.47507: stderr chunk (state=3): >>><<< 30583 1726853783.47510: stdout chunk (state=3): >>><<< 30583 1726853783.47548: done transferring module to remote 30583 1726853783.47561: _low_level_execute_command(): starting 30583 1726853783.47564: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853783.4133017-35890-257340076982831/ /root/.ansible/tmp/ansible-tmp-1726853783.4133017-35890-257340076982831/AnsiballZ_ping.py && sleep 0' 30583 1726853783.48010: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853783.48013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853783.48015: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853783.48017: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853783.48023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853783.48074: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853783.48078: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853783.48080: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853783.48151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853783.50027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853783.50052: stderr chunk (state=3): >>><<< 30583 1726853783.50056: stdout chunk (state=3): >>><<< 30583 1726853783.50074: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853783.50077: _low_level_execute_command(): starting 30583 1726853783.50082: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853783.4133017-35890-257340076982831/AnsiballZ_ping.py && sleep 0' 30583 1726853783.50511: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853783.50514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853783.50516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853783.50518: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853783.50520: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 30583 1726853783.50570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853783.50578: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853783.50654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853783.66255: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30583 1726853783.67669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853783.67702: stderr chunk (state=3): >>><<< 30583 1726853783.67705: stdout chunk (state=3): >>><<< 30583 1726853783.67720: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853783.67743: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853783.4133017-35890-257340076982831/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853783.67752: _low_level_execute_command(): starting 30583 1726853783.67757: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853783.4133017-35890-257340076982831/ > /dev/null 2>&1 && sleep 0' 30583 1726853783.68220: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853783.68224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853783.68226: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853783.68228: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 
1726853783.68230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853783.68277: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853783.68298: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853783.68305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853783.68365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853783.70295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853783.70311: stderr chunk (state=3): >>><<< 30583 1726853783.70314: stdout chunk (state=3): >>><<< 30583 1726853783.70328: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853783.70334: handler run complete 30583 1726853783.70348: attempt loop complete, returning result 30583 1726853783.70351: _execute() done 30583 1726853783.70353: dumping result to json 30583 1726853783.70355: done dumping result, returning 30583 1726853783.70365: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-05ea-abc5-00000000233d] 30583 1726853783.70368: sending task result for task 02083763-bbaf-05ea-abc5-00000000233d 30583 1726853783.70462: done sending task result for task 02083763-bbaf-05ea-abc5-00000000233d 30583 1726853783.70465: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 30583 1726853783.70536: no more pending results, returning what we have 30583 1726853783.70540: results queue empty 30583 1726853783.70540: checking for any_errors_fatal 30583 1726853783.70548: done checking for any_errors_fatal 30583 1726853783.70549: checking for max_fail_percentage 30583 1726853783.70551: done checking for max_fail_percentage 30583 1726853783.70552: checking to see if all hosts have failed and the running result is not ok 30583 1726853783.70553: done checking to see if all hosts have failed 30583 1726853783.70553: getting the remaining hosts for this loop 30583 1726853783.70555: done getting the remaining hosts for this loop 30583 1726853783.70561: getting the next task for host managed_node2 30583 1726853783.70578: done getting next task for host managed_node2 30583 1726853783.70581: ^ task is: TASK: meta (role_complete) 30583 1726853783.70585: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853783.70600: getting variables 30583 1726853783.70602: in VariableManager get_vars() 30583 1726853783.70649: Calling all_inventory to load vars for managed_node2 30583 1726853783.70652: Calling groups_inventory to load vars for managed_node2 30583 1726853783.70654: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853783.70666: Calling all_plugins_play to load vars for managed_node2 30583 1726853783.70668: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853783.70673: Calling groups_plugins_play to load vars for managed_node2 30583 1726853783.72278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853783.77355: done with get_vars() 30583 1726853783.77380: done getting variables 30583 1726853783.77432: done queuing things up, now waiting for results queue to drain 30583 1726853783.77434: results queue empty 30583 1726853783.77435: checking for any_errors_fatal 30583 1726853783.77437: done checking for 
any_errors_fatal 30583 1726853783.77438: checking for max_fail_percentage 30583 1726853783.77439: done checking for max_fail_percentage 30583 1726853783.77439: checking to see if all hosts have failed and the running result is not ok 30583 1726853783.77439: done checking to see if all hosts have failed 30583 1726853783.77440: getting the remaining hosts for this loop 30583 1726853783.77440: done getting the remaining hosts for this loop 30583 1726853783.77443: getting the next task for host managed_node2 30583 1726853783.77447: done getting next task for host managed_node2 30583 1726853783.77448: ^ task is: TASK: Include network role 30583 1726853783.77450: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853783.77451: getting variables 30583 1726853783.77452: in VariableManager get_vars() 30583 1726853783.77462: Calling all_inventory to load vars for managed_node2 30583 1726853783.77464: Calling groups_inventory to load vars for managed_node2 30583 1726853783.77465: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853783.77469: Calling all_plugins_play to load vars for managed_node2 30583 1726853783.77470: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853783.77474: Calling groups_plugins_play to load vars for managed_node2 30583 1726853783.78090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853783.78928: done with get_vars() 30583 1726853783.78942: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml:3 Friday 20 September 2024 13:36:23 -0400 (0:00:00.417) 0:01:59.127 ****** 30583 1726853783.78995: entering _queue_task() for managed_node2/include_role 30583 1726853783.79278: worker is 1 (out of 1 available) 30583 1726853783.79292: exiting _queue_task() for managed_node2/include_role 30583 1726853783.79305: done queuing things up, now waiting for results queue to drain 30583 1726853783.79306: waiting for pending results... 
30583 1726853783.79504: running TaskExecutor() for managed_node2/TASK: Include network role 30583 1726853783.79598: in run() - task 02083763-bbaf-05ea-abc5-000000002142 30583 1726853783.79609: variable 'ansible_search_path' from source: unknown 30583 1726853783.79613: variable 'ansible_search_path' from source: unknown 30583 1726853783.79644: calling self._execute() 30583 1726853783.79731: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853783.79735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853783.79741: variable 'omit' from source: magic vars 30583 1726853783.80040: variable 'ansible_distribution_major_version' from source: facts 30583 1726853783.80050: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853783.80055: _execute() done 30583 1726853783.80061: dumping result to json 30583 1726853783.80064: done dumping result, returning 30583 1726853783.80074: done running TaskExecutor() for managed_node2/TASK: Include network role [02083763-bbaf-05ea-abc5-000000002142] 30583 1726853783.80077: sending task result for task 02083763-bbaf-05ea-abc5-000000002142 30583 1726853783.80180: done sending task result for task 02083763-bbaf-05ea-abc5-000000002142 30583 1726853783.80183: WORKER PROCESS EXITING 30583 1726853783.80213: no more pending results, returning what we have 30583 1726853783.80218: in VariableManager get_vars() 30583 1726853783.80265: Calling all_inventory to load vars for managed_node2 30583 1726853783.80268: Calling groups_inventory to load vars for managed_node2 30583 1726853783.80273: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853783.80285: Calling all_plugins_play to load vars for managed_node2 30583 1726853783.80288: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853783.80291: Calling groups_plugins_play to load vars for managed_node2 30583 1726853783.81213: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853783.82061: done with get_vars() 30583 1726853783.82077: variable 'ansible_search_path' from source: unknown 30583 1726853783.82078: variable 'ansible_search_path' from source: unknown 30583 1726853783.82178: variable 'omit' from source: magic vars 30583 1726853783.82206: variable 'omit' from source: magic vars 30583 1726853783.82216: variable 'omit' from source: magic vars 30583 1726853783.82219: we have included files to process 30583 1726853783.82219: generating all_blocks data 30583 1726853783.82221: done generating all_blocks data 30583 1726853783.82224: processing included file: fedora.linux_system_roles.network 30583 1726853783.82238: in VariableManager get_vars() 30583 1726853783.82248: done with get_vars() 30583 1726853783.82269: in VariableManager get_vars() 30583 1726853783.82283: done with get_vars() 30583 1726853783.82309: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30583 1726853783.82386: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30583 1726853783.82433: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30583 1726853783.82704: in VariableManager get_vars() 30583 1726853783.82718: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30583 1726853783.83942: iterating over new_blocks loaded from include file 30583 1726853783.83944: in VariableManager get_vars() 30583 1726853783.83957: done with get_vars() 30583 1726853783.83959: filtering new block on tags 30583 1726853783.84115: done filtering new block on tags 30583 1726853783.84118: in VariableManager get_vars() 30583 1726853783.84128: done with get_vars() 30583 1726853783.84129: filtering new block on tags 30583 1726853783.84139: done 
filtering new block on tags 30583 1726853783.84140: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30583 1726853783.84144: extending task lists for all hosts with included blocks 30583 1726853783.84211: done extending task lists 30583 1726853783.84212: done processing included files 30583 1726853783.84213: results queue empty 30583 1726853783.84213: checking for any_errors_fatal 30583 1726853783.84214: done checking for any_errors_fatal 30583 1726853783.84215: checking for max_fail_percentage 30583 1726853783.84215: done checking for max_fail_percentage 30583 1726853783.84216: checking to see if all hosts have failed and the running result is not ok 30583 1726853783.84217: done checking to see if all hosts have failed 30583 1726853783.84217: getting the remaining hosts for this loop 30583 1726853783.84218: done getting the remaining hosts for this loop 30583 1726853783.84220: getting the next task for host managed_node2 30583 1726853783.84222: done getting next task for host managed_node2 30583 1726853783.84224: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30583 1726853783.84226: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853783.84236: getting variables 30583 1726853783.84236: in VariableManager get_vars() 30583 1726853783.84246: Calling all_inventory to load vars for managed_node2 30583 1726853783.84247: Calling groups_inventory to load vars for managed_node2 30583 1726853783.84248: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853783.84252: Calling all_plugins_play to load vars for managed_node2 30583 1726853783.84253: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853783.84255: Calling groups_plugins_play to load vars for managed_node2 30583 1726853783.84969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853783.85819: done with get_vars() 30583 1726853783.85833: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:36:23 -0400 (0:00:00.068) 0:01:59.196 ****** 30583 1726853783.85885: entering _queue_task() for managed_node2/include_tasks 30583 1726853783.86154: worker is 1 (out of 1 available) 30583 1726853783.86168: exiting _queue_task() for managed_node2/include_tasks 30583 1726853783.86182: done queuing things up, now waiting for results queue to drain 30583 1726853783.86183: waiting for pending results... 
30583 1726853783.86385: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30583 1726853783.86483: in run() - task 02083763-bbaf-05ea-abc5-0000000024a4 30583 1726853783.86495: variable 'ansible_search_path' from source: unknown 30583 1726853783.86499: variable 'ansible_search_path' from source: unknown 30583 1726853783.86529: calling self._execute() 30583 1726853783.86604: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853783.86608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853783.86617: variable 'omit' from source: magic vars 30583 1726853783.86902: variable 'ansible_distribution_major_version' from source: facts 30583 1726853783.86911: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853783.86917: _execute() done 30583 1726853783.86919: dumping result to json 30583 1726853783.86923: done dumping result, returning 30583 1726853783.86929: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-05ea-abc5-0000000024a4] 30583 1726853783.86934: sending task result for task 02083763-bbaf-05ea-abc5-0000000024a4 30583 1726853783.87017: done sending task result for task 02083763-bbaf-05ea-abc5-0000000024a4 30583 1726853783.87019: WORKER PROCESS EXITING 30583 1726853783.87110: no more pending results, returning what we have 30583 1726853783.87115: in VariableManager get_vars() 30583 1726853783.87162: Calling all_inventory to load vars for managed_node2 30583 1726853783.87165: Calling groups_inventory to load vars for managed_node2 30583 1726853783.87167: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853783.87180: Calling all_plugins_play to load vars for managed_node2 30583 1726853783.87183: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853783.87185: Calling 
groups_plugins_play to load vars for managed_node2 30583 1726853783.87962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853783.88832: done with get_vars() 30583 1726853783.88846: variable 'ansible_search_path' from source: unknown 30583 1726853783.88847: variable 'ansible_search_path' from source: unknown 30583 1726853783.88875: we have included files to process 30583 1726853783.88875: generating all_blocks data 30583 1726853783.88877: done generating all_blocks data 30583 1726853783.88879: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853783.88880: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853783.88881: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853783.89246: done processing included file 30583 1726853783.89248: iterating over new_blocks loaded from include file 30583 1726853783.89249: in VariableManager get_vars() 30583 1726853783.89267: done with get_vars() 30583 1726853783.89268: filtering new block on tags 30583 1726853783.89289: done filtering new block on tags 30583 1726853783.89291: in VariableManager get_vars() 30583 1726853783.89304: done with get_vars() 30583 1726853783.89305: filtering new block on tags 30583 1726853783.89330: done filtering new block on tags 30583 1726853783.89332: in VariableManager get_vars() 30583 1726853783.89349: done with get_vars() 30583 1726853783.89350: filtering new block on tags 30583 1726853783.89376: done filtering new block on tags 30583 1726853783.89377: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30583 1726853783.89381: extending task lists for 
all hosts with included blocks 30583 1726853783.90328: done extending task lists 30583 1726853783.90329: done processing included files 30583 1726853783.90330: results queue empty 30583 1726853783.90330: checking for any_errors_fatal 30583 1726853783.90332: done checking for any_errors_fatal 30583 1726853783.90333: checking for max_fail_percentage 30583 1726853783.90334: done checking for max_fail_percentage 30583 1726853783.90334: checking to see if all hosts have failed and the running result is not ok 30583 1726853783.90335: done checking to see if all hosts have failed 30583 1726853783.90335: getting the remaining hosts for this loop 30583 1726853783.90336: done getting the remaining hosts for this loop 30583 1726853783.90338: getting the next task for host managed_node2 30583 1726853783.90341: done getting next task for host managed_node2 30583 1726853783.90343: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30583 1726853783.90345: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853783.90353: getting variables 30583 1726853783.90354: in VariableManager get_vars() 30583 1726853783.90364: Calling all_inventory to load vars for managed_node2 30583 1726853783.90365: Calling groups_inventory to load vars for managed_node2 30583 1726853783.90367: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853783.90372: Calling all_plugins_play to load vars for managed_node2 30583 1726853783.90374: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853783.90375: Calling groups_plugins_play to load vars for managed_node2 30583 1726853783.91035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853783.91881: done with get_vars() 30583 1726853783.91897: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:36:23 -0400 (0:00:00.060) 0:01:59.256 ****** 30583 1726853783.91948: entering _queue_task() for managed_node2/setup 30583 1726853783.92213: worker is 1 (out of 1 available) 30583 1726853783.92227: exiting _queue_task() for managed_node2/setup 30583 1726853783.92241: done queuing things up, now waiting for results queue to drain 30583 1726853783.92242: waiting for pending results... 
30583 1726853783.92434: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30583 1726853783.92539: in run() - task 02083763-bbaf-05ea-abc5-0000000024fb 30583 1726853783.92549: variable 'ansible_search_path' from source: unknown 30583 1726853783.92554: variable 'ansible_search_path' from source: unknown 30583 1726853783.92588: calling self._execute() 30583 1726853783.92663: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853783.92670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853783.92682: variable 'omit' from source: magic vars 30583 1726853783.92975: variable 'ansible_distribution_major_version' from source: facts 30583 1726853783.92984: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853783.93135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853783.94635: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853783.94688: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853783.94715: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853783.94741: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853783.94764: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853783.94821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853783.94841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853783.94864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853783.94891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853783.94901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853783.94938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853783.94954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853783.94976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853783.95001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853783.95012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853783.95122: variable '__network_required_facts' from source: role 
'' defaults 30583 1726853783.95129: variable 'ansible_facts' from source: unknown 30583 1726853783.95597: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30583 1726853783.95600: when evaluation is False, skipping this task 30583 1726853783.95603: _execute() done 30583 1726853783.95605: dumping result to json 30583 1726853783.95607: done dumping result, returning 30583 1726853783.95617: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-05ea-abc5-0000000024fb] 30583 1726853783.95619: sending task result for task 02083763-bbaf-05ea-abc5-0000000024fb 30583 1726853783.95703: done sending task result for task 02083763-bbaf-05ea-abc5-0000000024fb 30583 1726853783.95706: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853783.95774: no more pending results, returning what we have 30583 1726853783.95777: results queue empty 30583 1726853783.95778: checking for any_errors_fatal 30583 1726853783.95780: done checking for any_errors_fatal 30583 1726853783.95780: checking for max_fail_percentage 30583 1726853783.95782: done checking for max_fail_percentage 30583 1726853783.95783: checking to see if all hosts have failed and the running result is not ok 30583 1726853783.95784: done checking to see if all hosts have failed 30583 1726853783.95784: getting the remaining hosts for this loop 30583 1726853783.95786: done getting the remaining hosts for this loop 30583 1726853783.95789: getting the next task for host managed_node2 30583 1726853783.95800: done getting next task for host managed_node2 30583 1726853783.95804: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30583 1726853783.95810: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853783.95839: getting variables 30583 1726853783.95840: in VariableManager get_vars() 30583 1726853783.95885: Calling all_inventory to load vars for managed_node2 30583 1726853783.95888: Calling groups_inventory to load vars for managed_node2 30583 1726853783.95890: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853783.95899: Calling all_plugins_play to load vars for managed_node2 30583 1726853783.95902: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853783.95910: Calling groups_plugins_play to load vars for managed_node2 30583 1726853783.96715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853783.97707: done with get_vars() 30583 1726853783.97723: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:36:23 -0400 (0:00:00.058) 0:01:59.315 ****** 30583 1726853783.97793: entering _queue_task() for managed_node2/stat 30583 1726853783.98031: worker is 1 (out of 1 available) 30583 1726853783.98045: exiting _queue_task() for managed_node2/stat 30583 1726853783.98058: done queuing things up, now waiting for results queue to drain 30583 1726853783.98060: waiting for pending results... 
30583 1726853783.98255: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30583 1726853783.98362: in run() - task 02083763-bbaf-05ea-abc5-0000000024fd 30583 1726853783.98378: variable 'ansible_search_path' from source: unknown 30583 1726853783.98382: variable 'ansible_search_path' from source: unknown 30583 1726853783.98413: calling self._execute() 30583 1726853783.98490: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853783.98495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853783.98505: variable 'omit' from source: magic vars 30583 1726853783.98794: variable 'ansible_distribution_major_version' from source: facts 30583 1726853783.98802: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853783.98922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853783.99125: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853783.99156: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853783.99191: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853783.99216: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853783.99284: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853783.99301: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853783.99319: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853783.99336: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853783.99413: variable '__network_is_ostree' from source: set_fact 30583 1726853783.99419: Evaluated conditional (not __network_is_ostree is defined): False 30583 1726853783.99422: when evaluation is False, skipping this task 30583 1726853783.99424: _execute() done 30583 1726853783.99427: dumping result to json 30583 1726853783.99430: done dumping result, returning 30583 1726853783.99438: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-05ea-abc5-0000000024fd] 30583 1726853783.99443: sending task result for task 02083763-bbaf-05ea-abc5-0000000024fd 30583 1726853783.99529: done sending task result for task 02083763-bbaf-05ea-abc5-0000000024fd 30583 1726853783.99531: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30583 1726853783.99586: no more pending results, returning what we have 30583 1726853783.99590: results queue empty 30583 1726853783.99591: checking for any_errors_fatal 30583 1726853783.99599: done checking for any_errors_fatal 30583 1726853783.99600: checking for max_fail_percentage 30583 1726853783.99602: done checking for max_fail_percentage 30583 1726853783.99603: checking to see if all hosts have failed and the running result is not ok 30583 1726853783.99604: done checking to see if all hosts have failed 30583 1726853783.99604: getting the remaining hosts for this loop 30583 1726853783.99607: done getting the remaining hosts for this loop 30583 
1726853783.99610: getting the next task for host managed_node2 30583 1726853783.99618: done getting next task for host managed_node2 30583 1726853783.99621: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30583 1726853783.99627: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853783.99660: getting variables 30583 1726853783.99662: in VariableManager get_vars() 30583 1726853783.99704: Calling all_inventory to load vars for managed_node2 30583 1726853783.99707: Calling groups_inventory to load vars for managed_node2 30583 1726853783.99709: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853783.99718: Calling all_plugins_play to load vars for managed_node2 30583 1726853783.99721: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853783.99724: Calling groups_plugins_play to load vars for managed_node2 30583 1726853784.00536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853784.01415: done with get_vars() 30583 1726853784.01432: done getting variables 30583 1726853784.01481: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:36:24 -0400 (0:00:00.037) 0:01:59.352 ****** 30583 1726853784.01510: entering _queue_task() for managed_node2/set_fact 30583 1726853784.01765: worker is 1 (out of 1 available) 30583 1726853784.01780: exiting _queue_task() for managed_node2/set_fact 30583 1726853784.01792: done queuing things up, now waiting for results queue to drain 30583 1726853784.01794: waiting for pending results... 
30583 1726853784.01987: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30583 1726853784.02087: in run() - task 02083763-bbaf-05ea-abc5-0000000024fe 30583 1726853784.02098: variable 'ansible_search_path' from source: unknown 30583 1726853784.02102: variable 'ansible_search_path' from source: unknown 30583 1726853784.02132: calling self._execute() 30583 1726853784.02209: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853784.02213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853784.02222: variable 'omit' from source: magic vars 30583 1726853784.02509: variable 'ansible_distribution_major_version' from source: facts 30583 1726853784.02517: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853784.02634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853784.02832: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853784.02862: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853784.02896: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853784.02919: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853784.02983: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853784.03004: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853784.03021: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853784.03040: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853784.03108: variable '__network_is_ostree' from source: set_fact 30583 1726853784.03111: Evaluated conditional (not __network_is_ostree is defined): False 30583 1726853784.03114: when evaluation is False, skipping this task 30583 1726853784.03116: _execute() done 30583 1726853784.03124: dumping result to json 30583 1726853784.03126: done dumping result, returning 30583 1726853784.03130: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-05ea-abc5-0000000024fe] 30583 1726853784.03137: sending task result for task 02083763-bbaf-05ea-abc5-0000000024fe 30583 1726853784.03219: done sending task result for task 02083763-bbaf-05ea-abc5-0000000024fe 30583 1726853784.03222: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30583 1726853784.03275: no more pending results, returning what we have 30583 1726853784.03278: results queue empty 30583 1726853784.03279: checking for any_errors_fatal 30583 1726853784.03287: done checking for any_errors_fatal 30583 1726853784.03288: checking for max_fail_percentage 30583 1726853784.03290: done checking for max_fail_percentage 30583 1726853784.03290: checking to see if all hosts have failed and the running result is not ok 30583 1726853784.03291: done checking to see if all hosts have failed 30583 1726853784.03292: getting the remaining hosts for this loop 30583 1726853784.03294: done getting the remaining hosts for this loop 
30583 1726853784.03297: getting the next task for host managed_node2 30583 1726853784.03309: done getting next task for host managed_node2 30583 1726853784.03312: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853784.03318: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853784.03345: getting variables 30583 1726853784.03347: in VariableManager get_vars() 30583 1726853784.03390: Calling all_inventory to load vars for managed_node2 30583 1726853784.03393: Calling groups_inventory to load vars for managed_node2 30583 1726853784.03395: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853784.03403: Calling all_plugins_play to load vars for managed_node2 30583 1726853784.03406: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853784.03409: Calling groups_plugins_play to load vars for managed_node2 30583 1726853784.04362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853784.05226: done with get_vars() 30583 1726853784.05243: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:36:24 -0400 (0:00:00.038) 0:01:59.390 ****** 30583 1726853784.05318: entering _queue_task() for managed_node2/service_facts 30583 1726853784.05576: worker is 1 (out of 1 available) 30583 1726853784.05589: exiting _queue_task() for managed_node2/service_facts 30583 1726853784.05603: done queuing things up, now waiting for results queue to drain 30583 1726853784.05604: waiting for pending results... 
30583 1726853784.05796: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853784.05908: in run() - task 02083763-bbaf-05ea-abc5-000000002500 30583 1726853784.05919: variable 'ansible_search_path' from source: unknown 30583 1726853784.05923: variable 'ansible_search_path' from source: unknown 30583 1726853784.05954: calling self._execute() 30583 1726853784.06028: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853784.06032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853784.06042: variable 'omit' from source: magic vars 30583 1726853784.06327: variable 'ansible_distribution_major_version' from source: facts 30583 1726853784.06337: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853784.06342: variable 'omit' from source: magic vars 30583 1726853784.06402: variable 'omit' from source: magic vars 30583 1726853784.06424: variable 'omit' from source: magic vars 30583 1726853784.06460: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853784.06489: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853784.06507: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853784.06520: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853784.06530: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853784.06554: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853784.06557: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853784.06562: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853784.06633: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853784.06637: Set connection var ansible_timeout to 10 30583 1726853784.06640: Set connection var ansible_connection to ssh 30583 1726853784.06646: Set connection var ansible_shell_executable to /bin/sh 30583 1726853784.06648: Set connection var ansible_shell_type to sh 30583 1726853784.06655: Set connection var ansible_pipelining to False 30583 1726853784.06675: variable 'ansible_shell_executable' from source: unknown 30583 1726853784.06678: variable 'ansible_connection' from source: unknown 30583 1726853784.06682: variable 'ansible_module_compression' from source: unknown 30583 1726853784.06684: variable 'ansible_shell_type' from source: unknown 30583 1726853784.06686: variable 'ansible_shell_executable' from source: unknown 30583 1726853784.06689: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853784.06691: variable 'ansible_pipelining' from source: unknown 30583 1726853784.06693: variable 'ansible_timeout' from source: unknown 30583 1726853784.06703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853784.06840: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853784.06847: variable 'omit' from source: magic vars 30583 1726853784.06852: starting attempt loop 30583 1726853784.06855: running the handler 30583 1726853784.06867: _low_level_execute_command(): starting 30583 1726853784.06875: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853784.07391: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30583 1726853784.07395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853784.07398: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853784.07400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853784.07449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853784.07452: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853784.07455: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853784.07538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853784.09286: stdout chunk (state=3): >>>/root <<< 30583 1726853784.09382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853784.09411: stderr chunk (state=3): >>><<< 30583 1726853784.09414: stdout chunk (state=3): >>><<< 30583 1726853784.09437: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853784.09448: _low_level_execute_command(): starting 30583 1726853784.09454: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853784.0943685-35906-157111516454075 `" && echo ansible-tmp-1726853784.0943685-35906-157111516454075="` echo /root/.ansible/tmp/ansible-tmp-1726853784.0943685-35906-157111516454075 `" ) && sleep 0' 30583 1726853784.09897: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853784.09901: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853784.09904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is 
address debug1: re-parsing configuration <<< 30583 1726853784.09913: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853784.09916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853784.09956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853784.09963: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853784.09964: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853784.10032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853784.12046: stdout chunk (state=3): >>>ansible-tmp-1726853784.0943685-35906-157111516454075=/root/.ansible/tmp/ansible-tmp-1726853784.0943685-35906-157111516454075 <<< 30583 1726853784.12145: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853784.12182: stderr chunk (state=3): >>><<< 30583 1726853784.12185: stdout chunk (state=3): >>><<< 30583 1726853784.12200: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853784.0943685-35906-157111516454075=/root/.ansible/tmp/ansible-tmp-1726853784.0943685-35906-157111516454075 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853784.12240: variable 'ansible_module_compression' from source: unknown 30583 1726853784.12282: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30583 1726853784.12313: variable 'ansible_facts' from source: unknown 30583 1726853784.12374: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853784.0943685-35906-157111516454075/AnsiballZ_service_facts.py 30583 1726853784.12478: Sending initial data 30583 1726853784.12482: Sent initial data (162 bytes) 30583 1726853784.12926: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853784.12929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853784.12931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853784.12934: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853784.12936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853784.12991: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853784.12997: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853784.12999: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853784.13069: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853784.14721: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30583 1726853784.14725: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853784.14787: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853784.14856: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpc77ortr5 /root/.ansible/tmp/ansible-tmp-1726853784.0943685-35906-157111516454075/AnsiballZ_service_facts.py <<< 30583 1726853784.14863: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853784.0943685-35906-157111516454075/AnsiballZ_service_facts.py" <<< 30583 1726853784.14923: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpc77ortr5" to remote "/root/.ansible/tmp/ansible-tmp-1726853784.0943685-35906-157111516454075/AnsiballZ_service_facts.py" <<< 30583 1726853784.14927: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853784.0943685-35906-157111516454075/AnsiballZ_service_facts.py" <<< 30583 1726853784.15553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853784.15594: stderr chunk (state=3): >>><<< 30583 1726853784.15597: stdout chunk (state=3): >>><<< 30583 1726853784.15660: done transferring module to remote 30583 1726853784.15668: _low_level_execute_command(): starting 30583 1726853784.15674: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853784.0943685-35906-157111516454075/ /root/.ansible/tmp/ansible-tmp-1726853784.0943685-35906-157111516454075/AnsiballZ_service_facts.py && sleep 0' 30583 1726853784.16110: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853784.16113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853784.16116: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853784.16118: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853784.16124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853784.16175: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853784.16178: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853784.16185: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853784.16266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853784.18143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853784.18167: stderr chunk (state=3): >>><<< 30583 1726853784.18173: stdout chunk (state=3): >>><<< 30583 1726853784.18183: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853784.18187: _low_level_execute_command(): starting 30583 1726853784.18191: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853784.0943685-35906-157111516454075/AnsiballZ_service_facts.py && sleep 0' 30583 1726853784.18614: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853784.18617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853784.18619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853784.18621: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853784.18623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853784.18679: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853784.18685: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853784.18764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853785.83727: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 30583 1726853785.83744: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": 
"systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 30583 1726853785.83758: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": 
{"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 30583 1726853785.83797: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": 
"man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", 
"state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "st<<< 30583 1726853785.83805: stdout chunk (state=3): >>>atic", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": 
"systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30583 1726853785.85376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853785.85405: stderr chunk (state=3): >>><<< 30583 1726853785.85408: stdout chunk (state=3): >>><<< 30583 1726853785.85441: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": 
"cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": 
"stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": 
"sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": 
"dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": 
{"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": 
"systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853785.85883: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853784.0943685-35906-157111516454075/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853785.85892: _low_level_execute_command(): starting 30583 1726853785.85897: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853784.0943685-35906-157111516454075/ > /dev/null 2>&1 && sleep 0' 30583 1726853785.86344: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853785.86348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853785.86350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853785.86352: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853785.86354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853785.86356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853785.86409: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853785.86416: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853785.86418: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853785.86485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853785.88412: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853785.88437: stderr chunk (state=3): >>><<< 30583 1726853785.88442: stdout chunk (state=3): >>><<< 30583 1726853785.88456: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853785.88461: handler run complete 30583 1726853785.88580: variable 'ansible_facts' from source: unknown 30583 1726853785.88682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853785.88962: variable 'ansible_facts' from source: unknown 30583 1726853785.89043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853785.89161: attempt loop complete, returning result 30583 1726853785.89164: _execute() done 30583 1726853785.89166: dumping result to json 30583 1726853785.89204: done dumping result, returning 30583 1726853785.89213: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-05ea-abc5-000000002500] 30583 1726853785.89218: sending task result for task 02083763-bbaf-05ea-abc5-000000002500 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853785.89840: no more pending results, returning what we have 30583 1726853785.89842: results queue empty 30583 1726853785.89843: checking for any_errors_fatal 30583 1726853785.89848: done checking for any_errors_fatal 30583 1726853785.89849: checking for max_fail_percentage 30583 1726853785.89850: done checking for max_fail_percentage 30583 1726853785.89851: checking to see if all hosts have failed and the running result is not 
ok 30583 1726853785.89852: done checking to see if all hosts have failed 30583 1726853785.89852: getting the remaining hosts for this loop 30583 1726853785.89854: done getting the remaining hosts for this loop 30583 1726853785.89857: getting the next task for host managed_node2 30583 1726853785.89865: done getting next task for host managed_node2 30583 1726853785.89868: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 1726853785.89878: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853785.89891: getting variables 30583 1726853785.89892: in VariableManager get_vars() 30583 1726853785.89918: Calling all_inventory to load vars for managed_node2 30583 1726853785.89920: Calling groups_inventory to load vars for managed_node2 30583 1726853785.89921: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853785.89929: Calling all_plugins_play to load vars for managed_node2 30583 1726853785.89931: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853785.89933: Calling groups_plugins_play to load vars for managed_node2 30583 1726853785.89939: done sending task result for task 02083763-bbaf-05ea-abc5-000000002500 30583 1726853785.90460: WORKER PROCESS EXITING 30583 1726853785.90791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853785.91665: done with get_vars() 30583 1726853785.91683: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:36:25 -0400 (0:00:01.864) 0:02:01.254 ****** 30583 1726853785.91753: entering _queue_task() for managed_node2/package_facts 30583 1726853785.92004: worker is 1 (out of 1 available) 30583 1726853785.92019: exiting _queue_task() for managed_node2/package_facts 30583 1726853785.92032: done queuing things up, now waiting for results queue to drain 30583 1726853785.92034: waiting for pending results... 
30583 1726853785.92224: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 1726853785.92325: in run() - task 02083763-bbaf-05ea-abc5-000000002501 30583 1726853785.92337: variable 'ansible_search_path' from source: unknown 30583 1726853785.92341: variable 'ansible_search_path' from source: unknown 30583 1726853785.92370: calling self._execute() 30583 1726853785.92452: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853785.92467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853785.92479: variable 'omit' from source: magic vars 30583 1726853785.92757: variable 'ansible_distribution_major_version' from source: facts 30583 1726853785.92769: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853785.92777: variable 'omit' from source: magic vars 30583 1726853785.92831: variable 'omit' from source: magic vars 30583 1726853785.92854: variable 'omit' from source: magic vars 30583 1726853785.92889: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853785.92917: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853785.92934: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853785.92947: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853785.92956: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853785.92984: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853785.92987: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853785.92990: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853785.93060: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853785.93068: Set connection var ansible_timeout to 10 30583 1726853785.93072: Set connection var ansible_connection to ssh 30583 1726853785.93078: Set connection var ansible_shell_executable to /bin/sh 30583 1726853785.93080: Set connection var ansible_shell_type to sh 30583 1726853785.93088: Set connection var ansible_pipelining to False 30583 1726853785.93106: variable 'ansible_shell_executable' from source: unknown 30583 1726853785.93109: variable 'ansible_connection' from source: unknown 30583 1726853785.93112: variable 'ansible_module_compression' from source: unknown 30583 1726853785.93114: variable 'ansible_shell_type' from source: unknown 30583 1726853785.93116: variable 'ansible_shell_executable' from source: unknown 30583 1726853785.93118: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853785.93120: variable 'ansible_pipelining' from source: unknown 30583 1726853785.93125: variable 'ansible_timeout' from source: unknown 30583 1726853785.93127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853785.93272: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853785.93280: variable 'omit' from source: magic vars 30583 1726853785.93285: starting attempt loop 30583 1726853785.93288: running the handler 30583 1726853785.93299: _low_level_execute_command(): starting 30583 1726853785.93305: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853785.93815: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30583 1726853785.93819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853785.93823: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853785.93825: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853785.93874: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853785.93878: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853785.93963: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853785.95683: stdout chunk (state=3): >>>/root <<< 30583 1726853785.95778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853785.95806: stderr chunk (state=3): >>><<< 30583 1726853785.95809: stdout chunk (state=3): >>><<< 30583 1726853785.95829: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853785.95841: _low_level_execute_command(): starting 30583 1726853785.95848: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853785.9582818-35920-174142605930808 `" && echo ansible-tmp-1726853785.9582818-35920-174142605930808="` echo /root/.ansible/tmp/ansible-tmp-1726853785.9582818-35920-174142605930808 `" ) && sleep 0' 30583 1726853785.96290: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853785.96293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853785.96296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853785.96305: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853785.96308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853785.96310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853785.96351: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853785.96354: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853785.96359: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853785.96428: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853785.98412: stdout chunk (state=3): >>>ansible-tmp-1726853785.9582818-35920-174142605930808=/root/.ansible/tmp/ansible-tmp-1726853785.9582818-35920-174142605930808 <<< 30583 1726853785.98518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853785.98544: stderr chunk (state=3): >>><<< 30583 1726853785.98547: stdout chunk (state=3): >>><<< 30583 1726853785.98567: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853785.9582818-35920-174142605930808=/root/.ansible/tmp/ansible-tmp-1726853785.9582818-35920-174142605930808 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853785.98606: variable 'ansible_module_compression' from source: unknown 30583 1726853785.98644: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30583 1726853785.98700: variable 'ansible_facts' from source: unknown 30583 1726853785.98819: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853785.9582818-35920-174142605930808/AnsiballZ_package_facts.py 30583 1726853785.98922: Sending initial data 30583 1726853785.98925: Sent initial data (162 bytes) 30583 1726853785.99363: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853785.99366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853785.99368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853785.99372: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853785.99375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853785.99428: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853785.99434: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853785.99441: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853785.99509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853786.01151: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30583 1726853786.01156: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853786.01216: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853786.01290: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpyd17azxu /root/.ansible/tmp/ansible-tmp-1726853785.9582818-35920-174142605930808/AnsiballZ_package_facts.py <<< 30583 1726853786.01295: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853785.9582818-35920-174142605930808/AnsiballZ_package_facts.py" <<< 30583 1726853786.01359: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpyd17azxu" to remote "/root/.ansible/tmp/ansible-tmp-1726853785.9582818-35920-174142605930808/AnsiballZ_package_facts.py" <<< 30583 1726853786.01361: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853785.9582818-35920-174142605930808/AnsiballZ_package_facts.py" <<< 30583 1726853786.02583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853786.02620: stderr chunk (state=3): >>><<< 30583 1726853786.02624: stdout chunk (state=3): >>><<< 30583 1726853786.02643: done transferring module to remote 30583 1726853786.02652: _low_level_execute_command(): starting 30583 1726853786.02657: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853785.9582818-35920-174142605930808/ /root/.ansible/tmp/ansible-tmp-1726853785.9582818-35920-174142605930808/AnsiballZ_package_facts.py && sleep 0' 30583 1726853786.03096: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853786.03099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853786.03102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853786.03110: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853786.03112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853786.03114: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853786.03158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853786.03161: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853786.03166: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853786.03234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853786.05100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853786.05123: stderr chunk (state=3): >>><<< 30583 1726853786.05127: stdout chunk (state=3): >>><<< 30583 1726853786.05139: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853786.05143: _low_level_execute_command(): starting 30583 1726853786.05150: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853785.9582818-35920-174142605930808/AnsiballZ_package_facts.py && sleep 0' 30583 1726853786.05590: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853786.05593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853786.05595: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853786.05597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853786.05599: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853786.05601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853786.05647: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853786.05650: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853786.05732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853786.51002: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", 
"version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", 
"release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": 
[{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 30583 1726853786.51036: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", 
"version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 30583 1726853786.51057: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 30583 1726853786.51078: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": 
"rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": 
"1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": 
"kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": 
"9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 30583 1726853786.51104: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", 
"release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 30583 1726853786.51146: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": 
"2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": 
"dracut-config-resc<<< 30583 1726853786.51154: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": 
"5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 30583 1726853786.51157: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", 
"epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 30583 1726853786.51191: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": 
[{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": 
"rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 30583 1726853786.51207: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", 
"version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", 
"release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": 
"2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 30583 1726853786.51216: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30583 1726853786.53045: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853786.53076: stderr chunk (state=3): >>><<< 30583 1726853786.53079: stdout chunk (state=3): >>><<< 30583 1726853786.53120: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853786.54428: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853785.9582818-35920-174142605930808/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853786.54445: _low_level_execute_command(): starting 30583 1726853786.54450: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853785.9582818-35920-174142605930808/ > /dev/null 2>&1 && sleep 0' 30583 1726853786.54909: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853786.54912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853786.54915: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853786.54917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 <<< 30583 1726853786.54919: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853786.54969: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853786.54978: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853786.54980: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853786.55048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853786.56932: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853786.56960: stderr chunk (state=3): >>><<< 30583 1726853786.56963: stdout chunk (state=3): >>><<< 30583 1726853786.56974: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853786.56980: handler run complete 30583 1726853786.57420: variable 'ansible_facts' from source: unknown 30583 1726853786.57685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853786.58725: variable 'ansible_facts' from source: unknown 30583 1726853786.58964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853786.59342: attempt loop complete, returning result 30583 1726853786.59351: _execute() done 30583 1726853786.59354: dumping result to json 30583 1726853786.59469: done dumping result, returning 30583 1726853786.59478: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-05ea-abc5-000000002501] 30583 1726853786.59482: sending task result for task 02083763-bbaf-05ea-abc5-000000002501 30583 1726853786.60776: done sending task result for task 02083763-bbaf-05ea-abc5-000000002501 30583 1726853786.60780: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853786.60875: no more pending results, returning what we have 30583 1726853786.60877: results queue empty 30583 1726853786.60878: checking for any_errors_fatal 30583 1726853786.60882: done checking for any_errors_fatal 30583 1726853786.60882: checking for max_fail_percentage 30583 1726853786.60883: done checking for max_fail_percentage 30583 1726853786.60884: checking to see if all hosts have failed and the running result is not ok 30583 1726853786.60884: done checking to see if all hosts have failed 30583 1726853786.60885: getting the remaining hosts for this loop 30583 1726853786.60886: done getting the remaining hosts for this loop 30583 1726853786.60889: getting 
the next task for host managed_node2 30583 1726853786.60894: done getting next task for host managed_node2 30583 1726853786.60896: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853786.60900: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853786.60908: getting variables 30583 1726853786.60909: in VariableManager get_vars() 30583 1726853786.60935: Calling all_inventory to load vars for managed_node2 30583 1726853786.60937: Calling groups_inventory to load vars for managed_node2 30583 1726853786.60938: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853786.60945: Calling all_plugins_play to load vars for managed_node2 30583 1726853786.60946: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853786.60948: Calling groups_plugins_play to load vars for managed_node2 30583 1726853786.61638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853786.62564: done with get_vars() 30583 1726853786.62587: done getting variables 30583 1726853786.62629: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:36:26 -0400 (0:00:00.709) 0:02:01.963 ****** 30583 1726853786.62656: entering _queue_task() for managed_node2/debug 30583 1726853786.62912: worker is 1 (out of 1 available) 30583 1726853786.62925: exiting _queue_task() for managed_node2/debug 30583 1726853786.62938: done queuing things up, now waiting for results queue to drain 30583 1726853786.62940: waiting for pending results... 
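The task banner above carries two timing fields: the per-task duration in parentheses (`0:00:00.709`) and the cumulative playbook elapsed time (`0:02:01.963`), both in `H:MM:SS.mmm` form. A minimal sketch of producing that format from a float of seconds, assuming the layout shown in the log (the function name is illustrative, not an Ansible API):

```python
def format_elapsed(seconds: float) -> str:
    """Format seconds as H:MM:SS.mmm, matching the log's timing fields."""
    hours = int(seconds // 3600)
    minutes = int((seconds % 3600) // 60)
    secs = seconds % 60                  # keep fractional part for the .mmm
    return f"{hours}:{minutes:02d}:{secs:06.3f}"

# 2 minutes 1.963 seconds -> the cumulative value shown in the banner
print(format_elapsed(121.963))
```

Here `{secs:06.3f}` zero-pads the seconds field to width 6 (two integer digits plus `.mmm`), which is what keeps `0:02:01.963` aligned with longer runs.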
30583 1726853786.63129: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853786.63228: in run() - task 02083763-bbaf-05ea-abc5-0000000024a5 30583 1726853786.63239: variable 'ansible_search_path' from source: unknown 30583 1726853786.63242: variable 'ansible_search_path' from source: unknown 30583 1726853786.63277: calling self._execute() 30583 1726853786.63353: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853786.63360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853786.63367: variable 'omit' from source: magic vars 30583 1726853786.63647: variable 'ansible_distribution_major_version' from source: facts 30583 1726853786.63657: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853786.63663: variable 'omit' from source: magic vars 30583 1726853786.63712: variable 'omit' from source: magic vars 30583 1726853786.63779: variable 'network_provider' from source: set_fact 30583 1726853786.63793: variable 'omit' from source: magic vars 30583 1726853786.63828: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853786.63856: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853786.63874: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853786.63888: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853786.63897: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853786.63922: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853786.63925: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 
1726853786.63928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853786.64000: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853786.64004: Set connection var ansible_timeout to 10 30583 1726853786.64007: Set connection var ansible_connection to ssh 30583 1726853786.64012: Set connection var ansible_shell_executable to /bin/sh 30583 1726853786.64014: Set connection var ansible_shell_type to sh 30583 1726853786.64022: Set connection var ansible_pipelining to False 30583 1726853786.64045: variable 'ansible_shell_executable' from source: unknown 30583 1726853786.64048: variable 'ansible_connection' from source: unknown 30583 1726853786.64051: variable 'ansible_module_compression' from source: unknown 30583 1726853786.64053: variable 'ansible_shell_type' from source: unknown 30583 1726853786.64056: variable 'ansible_shell_executable' from source: unknown 30583 1726853786.64060: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853786.64063: variable 'ansible_pipelining' from source: unknown 30583 1726853786.64064: variable 'ansible_timeout' from source: unknown 30583 1726853786.64067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853786.64267: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853786.64272: variable 'omit' from source: magic vars 30583 1726853786.64274: starting attempt loop 30583 1726853786.64276: running the handler 30583 1726853786.64277: handler run complete 30583 1726853786.64279: attempt loop complete, returning result 30583 1726853786.64281: _execute() done 30583 1726853786.64283: dumping result to json 30583 1726853786.64284: done dumping result, returning 
30583 1726853786.64286: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-05ea-abc5-0000000024a5] 30583 1726853786.64287: sending task result for task 02083763-bbaf-05ea-abc5-0000000024a5 30583 1726853786.64347: done sending task result for task 02083763-bbaf-05ea-abc5-0000000024a5 30583 1726853786.64350: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 30583 1726853786.64436: no more pending results, returning what we have 30583 1726853786.64439: results queue empty 30583 1726853786.64440: checking for any_errors_fatal 30583 1726853786.64446: done checking for any_errors_fatal 30583 1726853786.64446: checking for max_fail_percentage 30583 1726853786.64448: done checking for max_fail_percentage 30583 1726853786.64449: checking to see if all hosts have failed and the running result is not ok 30583 1726853786.64449: done checking to see if all hosts have failed 30583 1726853786.64450: getting the remaining hosts for this loop 30583 1726853786.64452: done getting the remaining hosts for this loop 30583 1726853786.64455: getting the next task for host managed_node2 30583 1726853786.64465: done getting next task for host managed_node2 30583 1726853786.64468: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853786.64474: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853786.64485: getting variables 30583 1726853786.64487: in VariableManager get_vars() 30583 1726853786.64522: Calling all_inventory to load vars for managed_node2 30583 1726853786.64525: Calling groups_inventory to load vars for managed_node2 30583 1726853786.64527: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853786.64535: Calling all_plugins_play to load vars for managed_node2 30583 1726853786.64537: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853786.64539: Calling groups_plugins_play to load vars for managed_node2 30583 1726853786.65303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853786.66176: done with get_vars() 30583 1726853786.66198: done getting variables 30583 1726853786.66242: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:36:26 -0400 (0:00:00.036) 0:02:01.999 ****** 30583 1726853786.66276: entering _queue_task() for managed_node2/fail 30583 1726853786.66538: worker is 1 (out of 1 available) 30583 1726853786.66553: exiting _queue_task() for managed_node2/fail 30583 1726853786.66569: done queuing things up, now waiting for results queue to drain 30583 1726853786.66572: waiting for pending results... 30583 1726853786.66764: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853786.66866: in run() - task 02083763-bbaf-05ea-abc5-0000000024a6 30583 1726853786.66878: variable 'ansible_search_path' from source: unknown 30583 1726853786.66883: variable 'ansible_search_path' from source: unknown 30583 1726853786.66915: calling self._execute() 30583 1726853786.67000: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853786.67005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853786.67017: variable 'omit' from source: magic vars 30583 1726853786.67310: variable 'ansible_distribution_major_version' from source: facts 30583 1726853786.67319: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853786.67406: variable 'network_state' from source: role '' defaults 30583 1726853786.67414: Evaluated conditional (network_state != {}): False 30583 1726853786.67417: when evaluation is False, skipping this task 30583 1726853786.67420: _execute() done 30583 1726853786.67422: dumping result to json 30583 1726853786.67425: done dumping result, returning 30583 1726853786.67433: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-05ea-abc5-0000000024a6] 30583 1726853786.67437: sending task result for task 02083763-bbaf-05ea-abc5-0000000024a6 30583 1726853786.67524: done sending task result for task 02083763-bbaf-05ea-abc5-0000000024a6 30583 1726853786.67527: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853786.67598: no more pending results, returning what we have 30583 1726853786.67602: results queue empty 30583 1726853786.67603: checking for any_errors_fatal 30583 1726853786.67610: done checking for any_errors_fatal 30583 1726853786.67610: checking for max_fail_percentage 30583 1726853786.67612: done checking for max_fail_percentage 30583 1726853786.67613: checking to see if all hosts have failed and the running result is not ok 30583 1726853786.67613: done checking to see if all hosts have failed 30583 1726853786.67614: getting the remaining hosts for this loop 30583 1726853786.67616: done getting the remaining hosts for this loop 30583 1726853786.67619: getting the next task for host managed_node2 30583 1726853786.67627: done getting next task for host managed_node2 30583 1726853786.67631: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30583 1726853786.67636: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853786.67661: getting variables 30583 1726853786.67663: in VariableManager get_vars() 30583 1726853786.67705: Calling all_inventory to load vars for managed_node2 30583 1726853786.67707: Calling groups_inventory to load vars for managed_node2 30583 1726853786.67709: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853786.67717: Calling all_plugins_play to load vars for managed_node2 30583 1726853786.67720: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853786.67722: Calling groups_plugins_play to load vars for managed_node2 30583 1726853786.68646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853786.69497: done with get_vars() 30583 1726853786.69522: done getting variables 30583 1726853786.69567: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:36:26 -0400 (0:00:00.033) 0:02:02.033 ****** 30583 1726853786.69596: entering _queue_task() for managed_node2/fail 30583 1726853786.69868: worker is 1 (out of 1 available) 30583 1726853786.69884: exiting _queue_task() for managed_node2/fail 30583 1726853786.69897: done queuing things up, now waiting for results queue to drain 30583 1726853786.69898: waiting for pending results... 30583 1726853786.70096: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30583 1726853786.70206: in run() - task 02083763-bbaf-05ea-abc5-0000000024a7 30583 1726853786.70217: variable 'ansible_search_path' from source: unknown 30583 1726853786.70221: variable 'ansible_search_path' from source: unknown 30583 1726853786.70254: calling self._execute() 30583 1726853786.70337: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853786.70341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853786.70350: variable 'omit' from source: magic vars 30583 1726853786.70640: variable 'ansible_distribution_major_version' from source: facts 30583 1726853786.70650: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853786.70735: variable 'network_state' from source: role '' defaults 30583 1726853786.70744: Evaluated conditional (network_state != {}): False 30583 1726853786.70747: when evaluation is False, skipping this task 30583 1726853786.70750: _execute() done 30583 1726853786.70752: dumping result to json 30583 1726853786.70754: done dumping result, returning 30583 1726853786.70765: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [02083763-bbaf-05ea-abc5-0000000024a7] 30583 1726853786.70769: sending task result for task 02083763-bbaf-05ea-abc5-0000000024a7 30583 1726853786.70857: done sending task result for task 02083763-bbaf-05ea-abc5-0000000024a7 30583 1726853786.70860: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853786.70935: no more pending results, returning what we have 30583 1726853786.70938: results queue empty 30583 1726853786.70939: checking for any_errors_fatal 30583 1726853786.70948: done checking for any_errors_fatal 30583 1726853786.70949: checking for max_fail_percentage 30583 1726853786.70950: done checking for max_fail_percentage 30583 1726853786.70951: checking to see if all hosts have failed and the running result is not ok 30583 1726853786.70952: done checking to see if all hosts have failed 30583 1726853786.70953: getting the remaining hosts for this loop 30583 1726853786.70955: done getting the remaining hosts for this loop 30583 1726853786.70959: getting the next task for host managed_node2 30583 1726853786.70966: done getting next task for host managed_node2 30583 1726853786.70970: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30583 1726853786.70977: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853786.71005: getting variables 30583 1726853786.71006: in VariableManager get_vars() 30583 1726853786.71047: Calling all_inventory to load vars for managed_node2 30583 1726853786.71049: Calling groups_inventory to load vars for managed_node2 30583 1726853786.71051: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853786.71059: Calling all_plugins_play to load vars for managed_node2 30583 1726853786.71062: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853786.71064: Calling groups_plugins_play to load vars for managed_node2 30583 1726853786.71840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853786.72701: done with get_vars() 30583 1726853786.72719: done getting variables 30583 1726853786.72761: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:36:26 -0400 (0:00:00.031) 0:02:02.065 ****** 30583 1726853786.72788: entering _queue_task() for managed_node2/fail 30583 1726853786.73040: worker is 1 (out of 1 available) 30583 1726853786.73056: exiting _queue_task() for managed_node2/fail 30583 1726853786.73070: done queuing things up, now waiting for results queue to drain 30583 1726853786.73073: waiting for pending results... 30583 1726853786.73268: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30583 1726853786.73375: in run() - task 02083763-bbaf-05ea-abc5-0000000024a8 30583 1726853786.73386: variable 'ansible_search_path' from source: unknown 30583 1726853786.73389: variable 'ansible_search_path' from source: unknown 30583 1726853786.73421: calling self._execute() 30583 1726853786.73502: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853786.73506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853786.73517: variable 'omit' from source: magic vars 30583 1726853786.73804: variable 'ansible_distribution_major_version' from source: facts 30583 1726853786.73813: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853786.73937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853786.76287: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853786.76328: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853786.76355: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853786.76397: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853786.76417: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853786.76479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853786.76502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853786.76519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853786.76544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853786.76555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853786.76627: variable 'ansible_distribution_major_version' from source: facts 30583 1726853786.76643: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30583 1726853786.76725: variable 'ansible_distribution' from source: facts 30583 1726853786.76729: variable '__network_rh_distros' from source: role '' defaults 30583 1726853786.76737: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30583 1726853786.76894: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853786.76911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853786.76929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853786.76955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853786.76968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853786.77001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853786.77017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853786.77033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853786.77062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 
1726853786.77074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853786.77102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853786.77118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853786.77133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853786.77160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853786.77174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853786.77356: variable 'network_connections' from source: include params 30583 1726853786.77369: variable 'interface' from source: play vars 30583 1726853786.77415: variable 'interface' from source: play vars 30583 1726853786.77423: variable 'network_state' from source: role '' defaults 30583 1726853786.77473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853786.77584: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853786.77612: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853786.77634: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853786.77654: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853786.77689: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853786.77707: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853786.77728: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853786.77745: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853786.77767: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30583 1726853786.77770: when evaluation is False, skipping this task 30583 1726853786.77774: _execute() done 30583 1726853786.77776: dumping result to json 30583 1726853786.77778: done dumping result, returning 30583 1726853786.77786: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-05ea-abc5-0000000024a8] 30583 1726853786.77788: sending task result for task 
02083763-bbaf-05ea-abc5-0000000024a8 30583 1726853786.77875: done sending task result for task 02083763-bbaf-05ea-abc5-0000000024a8 30583 1726853786.77878: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30583 1726853786.77953: no more pending results, returning what we have 30583 1726853786.77956: results queue empty 30583 1726853786.77957: checking for any_errors_fatal 30583 1726853786.77963: done checking for any_errors_fatal 30583 1726853786.77964: checking for max_fail_percentage 30583 1726853786.77966: done checking for max_fail_percentage 30583 1726853786.77967: checking to see if all hosts have failed and the running result is not ok 30583 1726853786.77968: done checking to see if all hosts have failed 30583 1726853786.77969: getting the remaining hosts for this loop 30583 1726853786.77973: done getting the remaining hosts for this loop 30583 1726853786.77976: getting the next task for host managed_node2 30583 1726853786.77984: done getting next task for host managed_node2 30583 1726853786.77988: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30583 1726853786.77993: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853786.78021: getting variables 30583 1726853786.78022: in VariableManager get_vars() 30583 1726853786.78066: Calling all_inventory to load vars for managed_node2 30583 1726853786.78069: Calling groups_inventory to load vars for managed_node2 30583 1726853786.78076: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853786.78085: Calling all_plugins_play to load vars for managed_node2 30583 1726853786.78087: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853786.78090: Calling groups_plugins_play to load vars for managed_node2 30583 1726853786.79017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853786.79876: done with get_vars() 30583 1726853786.79896: done getting variables 30583 1726853786.79939: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:36:26 -0400 (0:00:00.071) 0:02:02.136 ****** 30583 1726853786.79964: entering _queue_task() for managed_node2/dnf 30583 1726853786.80224: worker is 1 (out of 1 available) 30583 1726853786.80239: exiting _queue_task() for managed_node2/dnf 30583 1726853786.80253: done queuing things up, now waiting for results queue to drain 30583 1726853786.80255: waiting for pending results... 30583 1726853786.80455: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30583 1726853786.80577: in run() - task 02083763-bbaf-05ea-abc5-0000000024a9 30583 1726853786.80593: variable 'ansible_search_path' from source: unknown 30583 1726853786.80597: variable 'ansible_search_path' from source: unknown 30583 1726853786.80626: calling self._execute() 30583 1726853786.80710: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853786.80714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853786.80724: variable 'omit' from source: magic vars 30583 1726853786.81017: variable 'ansible_distribution_major_version' from source: facts 30583 1726853786.81027: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853786.81169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853786.82707: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853786.82759: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853786.82791: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853786.82817: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853786.82837: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853786.82899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853786.82919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853786.82936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853786.82964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853786.82976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853786.83060: variable 'ansible_distribution' from source: facts 30583 1726853786.83067: variable 'ansible_distribution_major_version' from source: facts 30583 1726853786.83083: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30583 1726853786.83157: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853786.83246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853786.83265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853786.83284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853786.83311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853786.83322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853786.83348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853786.83366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853786.83384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853786.83408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853786.83420: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853786.83447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853786.83465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853786.83483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853786.83506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853786.83516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853786.83619: variable 'network_connections' from source: include params 30583 1726853786.83630: variable 'interface' from source: play vars 30583 1726853786.83677: variable 'interface' from source: play vars 30583 1726853786.83727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853786.83838: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853786.83869: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853786.83893: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853786.83913: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853786.83954: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853786.83977: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853786.84000: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853786.84017: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853786.84052: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853786.84210: variable 'network_connections' from source: include params 30583 1726853786.84213: variable 'interface' from source: play vars 30583 1726853786.84257: variable 'interface' from source: play vars 30583 1726853786.84281: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853786.84285: when evaluation is False, skipping this task 30583 1726853786.84289: _execute() done 30583 1726853786.84291: dumping result to json 30583 1726853786.84293: done dumping result, returning 30583 1726853786.84300: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-0000000024a9] 30583 
1726853786.84303: sending task result for task 02083763-bbaf-05ea-abc5-0000000024a9 30583 1726853786.84394: done sending task result for task 02083763-bbaf-05ea-abc5-0000000024a9 30583 1726853786.84396: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853786.84448: no more pending results, returning what we have 30583 1726853786.84452: results queue empty 30583 1726853786.84453: checking for any_errors_fatal 30583 1726853786.84459: done checking for any_errors_fatal 30583 1726853786.84460: checking for max_fail_percentage 30583 1726853786.84462: done checking for max_fail_percentage 30583 1726853786.84463: checking to see if all hosts have failed and the running result is not ok 30583 1726853786.84464: done checking to see if all hosts have failed 30583 1726853786.84464: getting the remaining hosts for this loop 30583 1726853786.84466: done getting the remaining hosts for this loop 30583 1726853786.84472: getting the next task for host managed_node2 30583 1726853786.84481: done getting next task for host managed_node2 30583 1726853786.84485: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30583 1726853786.84490: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853786.84521: getting variables 30583 1726853786.84522: in VariableManager get_vars() 30583 1726853786.84569: Calling all_inventory to load vars for managed_node2 30583 1726853786.84576: Calling groups_inventory to load vars for managed_node2 30583 1726853786.84579: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853786.84588: Calling all_plugins_play to load vars for managed_node2 30583 1726853786.84590: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853786.84592: Calling groups_plugins_play to load vars for managed_node2 30583 1726853786.85417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853786.86397: done with get_vars() 30583 1726853786.86416: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30583 1726853786.86472: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:36:26 -0400 (0:00:00.065) 0:02:02.202 ****** 30583 1726853786.86496: entering _queue_task() for managed_node2/yum 30583 1726853786.86757: worker is 1 (out of 1 available) 30583 1726853786.86770: exiting _queue_task() for managed_node2/yum 30583 1726853786.86783: done queuing things up, now waiting for results queue to drain 30583 1726853786.86785: waiting for pending results... 30583 1726853786.86978: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30583 1726853786.87091: in run() - task 02083763-bbaf-05ea-abc5-0000000024aa 30583 1726853786.87102: variable 'ansible_search_path' from source: unknown 30583 1726853786.87106: variable 'ansible_search_path' from source: unknown 30583 1726853786.87138: calling self._execute() 30583 1726853786.87217: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853786.87221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853786.87231: variable 'omit' from source: magic vars 30583 1726853786.87521: variable 'ansible_distribution_major_version' from source: facts 30583 1726853786.87530: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853786.87651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853786.94287: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853786.94327: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853786.94352: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853786.94378: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853786.94408: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853786.94455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853786.94478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853786.94496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853786.94524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853786.94534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853786.94599: variable 'ansible_distribution_major_version' from source: facts 30583 1726853786.94610: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30583 1726853786.94613: when evaluation is False, skipping this task 30583 1726853786.94617: _execute() done 30583 1726853786.94619: dumping result to json 30583 1726853786.94621: done dumping result, returning 30583 1726853786.94630: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-0000000024aa] 30583 1726853786.94632: sending task result for task 02083763-bbaf-05ea-abc5-0000000024aa 30583 1726853786.94718: done sending task result for task 02083763-bbaf-05ea-abc5-0000000024aa 30583 1726853786.94721: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30583 1726853786.94777: no more pending results, returning what we have 30583 1726853786.94781: results queue empty 30583 1726853786.94782: checking for any_errors_fatal 30583 1726853786.94788: done checking for any_errors_fatal 30583 1726853786.94789: checking for max_fail_percentage 30583 1726853786.94791: done checking for max_fail_percentage 30583 1726853786.94791: checking to see if all hosts have failed and the running result is not ok 30583 1726853786.94792: done checking to see if all hosts have failed 30583 1726853786.94793: getting the remaining hosts for this loop 30583 1726853786.94795: done getting the remaining hosts for this loop 30583 1726853786.94798: getting the next task for host managed_node2 30583 1726853786.94804: done getting next task for host managed_node2 30583 1726853786.94807: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30583 1726853786.94812: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853786.94837: getting variables 30583 1726853786.94840: in VariableManager get_vars() 30583 1726853786.94885: Calling all_inventory to load vars for managed_node2 30583 1726853786.94887: Calling groups_inventory to load vars for managed_node2 30583 1726853786.94890: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853786.94898: Calling all_plugins_play to load vars for managed_node2 30583 1726853786.94900: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853786.94903: Calling groups_plugins_play to load vars for managed_node2 30583 1726853787.00127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853787.00973: done with get_vars() 30583 1726853787.00991: done getting variables 30583 1726853787.01025: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:36:27 -0400 (0:00:00.145) 0:02:02.347 ****** 30583 1726853787.01049: entering _queue_task() for managed_node2/fail 30583 1726853787.01333: worker is 1 (out of 1 available) 30583 1726853787.01346: exiting _queue_task() for managed_node2/fail 30583 1726853787.01359: done queuing things up, now waiting for results queue to drain 30583 1726853787.01361: waiting for pending results... 30583 1726853787.01558: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30583 1726853787.01676: in run() - task 02083763-bbaf-05ea-abc5-0000000024ab 30583 1726853787.01692: variable 'ansible_search_path' from source: unknown 30583 1726853787.01697: variable 'ansible_search_path' from source: unknown 30583 1726853787.01727: calling self._execute() 30583 1726853787.01806: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853787.01813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853787.01822: variable 'omit' from source: magic vars 30583 1726853787.02114: variable 'ansible_distribution_major_version' from source: facts 30583 1726853787.02124: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853787.02217: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853787.02351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853787.03860: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853787.03917: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853787.03942: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853787.03970: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853787.03993: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853787.04049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853787.04073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853787.04096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853787.04120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853787.04130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853787.04163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853787.04182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853787.04200: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853787.04226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853787.04236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853787.04265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853787.04283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853787.04299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853787.04325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853787.04335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853787.04449: variable 'network_connections' from source: include params 30583 1726853787.04460: variable 'interface' from source: play vars 30583 1726853787.04507: variable 'interface' from source: play vars 30583 1726853787.04560: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853787.04681: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853787.04709: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853787.04731: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853787.04864: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853787.04867: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853787.04870: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853787.04876: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853787.04878: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853787.04881: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853787.05024: variable 'network_connections' from source: include params 30583 1726853787.05027: variable 'interface' from source: play vars 30583 1726853787.05074: variable 'interface' from source: play vars 30583 1726853787.05093: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853787.05097: when evaluation is False, skipping this task 30583 
1726853787.05100: _execute() done 30583 1726853787.05102: dumping result to json 30583 1726853787.05104: done dumping result, returning 30583 1726853787.05110: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-0000000024ab] 30583 1726853787.05115: sending task result for task 02083763-bbaf-05ea-abc5-0000000024ab 30583 1726853787.05201: done sending task result for task 02083763-bbaf-05ea-abc5-0000000024ab 30583 1726853787.05203: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853787.05250: no more pending results, returning what we have 30583 1726853787.05254: results queue empty 30583 1726853787.05254: checking for any_errors_fatal 30583 1726853787.05264: done checking for any_errors_fatal 30583 1726853787.05264: checking for max_fail_percentage 30583 1726853787.05266: done checking for max_fail_percentage 30583 1726853787.05267: checking to see if all hosts have failed and the running result is not ok 30583 1726853787.05268: done checking to see if all hosts have failed 30583 1726853787.05269: getting the remaining hosts for this loop 30583 1726853787.05272: done getting the remaining hosts for this loop 30583 1726853787.05276: getting the next task for host managed_node2 30583 1726853787.05285: done getting next task for host managed_node2 30583 1726853787.05289: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30583 1726853787.05293: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853787.05324: getting variables 30583 1726853787.05325: in VariableManager get_vars() 30583 1726853787.05376: Calling all_inventory to load vars for managed_node2 30583 1726853787.05379: Calling groups_inventory to load vars for managed_node2 30583 1726853787.05381: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853787.05390: Calling all_plugins_play to load vars for managed_node2 30583 1726853787.05393: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853787.05395: Calling groups_plugins_play to load vars for managed_node2 30583 1726853787.06201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853787.07178: done with get_vars() 30583 1726853787.07194: done getting variables 30583 1726853787.07238: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:36:27 -0400 (0:00:00.062) 0:02:02.409 ****** 30583 1726853787.07267: entering _queue_task() for managed_node2/package 30583 1726853787.07529: worker is 1 (out of 1 available) 30583 1726853787.07544: exiting _queue_task() for managed_node2/package 30583 1726853787.07558: done queuing things up, now waiting for results queue to drain 30583 1726853787.07560: waiting for pending results... 30583 1726853787.07752: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 30583 1726853787.07875: in run() - task 02083763-bbaf-05ea-abc5-0000000024ac 30583 1726853787.07884: variable 'ansible_search_path' from source: unknown 30583 1726853787.07889: variable 'ansible_search_path' from source: unknown 30583 1726853787.07919: calling self._execute() 30583 1726853787.08001: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853787.08005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853787.08075: variable 'omit' from source: magic vars 30583 1726853787.08308: variable 'ansible_distribution_major_version' from source: facts 30583 1726853787.08316: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853787.08449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853787.08649: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853787.08686: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853787.08714: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853787.08772: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853787.08856: variable 'network_packages' from source: role '' defaults 30583 1726853787.08929: variable '__network_provider_setup' from source: role '' defaults 30583 1726853787.08937: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853787.08984: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853787.08993: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853787.09036: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853787.09150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853787.10483: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853787.10527: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853787.10554: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853787.10580: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853787.10599: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853787.10666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853787.10688: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853787.10705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853787.10730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853787.10743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853787.10776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853787.10793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853787.10809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853787.10833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853787.10845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 
1726853787.10986: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853787.11055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853787.11077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853787.11095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853787.11118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853787.11129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853787.11194: variable 'ansible_python' from source: facts 30583 1726853787.11207: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853787.11263: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853787.11317: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853787.11399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853787.11415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853787.11432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853787.11456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853787.11467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853787.11503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853787.11521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853787.11537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853787.11563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853787.11574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853787.11669: variable 'network_connections' from source: include params 
30583 1726853787.11674: variable 'interface' from source: play vars 30583 1726853787.11743: variable 'interface' from source: play vars 30583 1726853787.11796: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853787.11815: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853787.11838: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853787.11863: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853787.11899: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853787.12075: variable 'network_connections' from source: include params 30583 1726853787.12079: variable 'interface' from source: play vars 30583 1726853787.12150: variable 'interface' from source: play vars 30583 1726853787.12174: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853787.12227: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853787.12420: variable 'network_connections' from source: include params 30583 1726853787.12423: variable 'interface' from source: play vars 30583 1726853787.12467: variable 'interface' from source: play vars 30583 1726853787.12486: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853787.12539: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853787.12736: variable 'network_connections' 
from source: include params 30583 1726853787.12739: variable 'interface' from source: play vars 30583 1726853787.12786: variable 'interface' from source: play vars 30583 1726853787.12825: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853787.12866: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853787.12874: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853787.12916: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853787.13049: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853787.13345: variable 'network_connections' from source: include params 30583 1726853787.13348: variable 'interface' from source: play vars 30583 1726853787.13394: variable 'interface' from source: play vars 30583 1726853787.13400: variable 'ansible_distribution' from source: facts 30583 1726853787.13402: variable '__network_rh_distros' from source: role '' defaults 30583 1726853787.13409: variable 'ansible_distribution_major_version' from source: facts 30583 1726853787.13420: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853787.13527: variable 'ansible_distribution' from source: facts 30583 1726853787.13530: variable '__network_rh_distros' from source: role '' defaults 30583 1726853787.13534: variable 'ansible_distribution_major_version' from source: facts 30583 1726853787.13545: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853787.13648: variable 'ansible_distribution' from source: facts 30583 1726853787.13651: variable '__network_rh_distros' from source: role '' defaults 30583 1726853787.13655: variable 'ansible_distribution_major_version' from source: facts 30583 1726853787.13688: variable 'network_provider' from source: set_fact 30583 
1726853787.13696: variable 'ansible_facts' from source: unknown 30583 1726853787.14137: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30583 1726853787.14140: when evaluation is False, skipping this task 30583 1726853787.14143: _execute() done 30583 1726853787.14145: dumping result to json 30583 1726853787.14147: done dumping result, returning 30583 1726853787.14156: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-05ea-abc5-0000000024ac] 30583 1726853787.14161: sending task result for task 02083763-bbaf-05ea-abc5-0000000024ac 30583 1726853787.14256: done sending task result for task 02083763-bbaf-05ea-abc5-0000000024ac 30583 1726853787.14262: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30583 1726853787.14311: no more pending results, returning what we have 30583 1726853787.14315: results queue empty 30583 1726853787.14316: checking for any_errors_fatal 30583 1726853787.14322: done checking for any_errors_fatal 30583 1726853787.14322: checking for max_fail_percentage 30583 1726853787.14324: done checking for max_fail_percentage 30583 1726853787.14325: checking to see if all hosts have failed and the running result is not ok 30583 1726853787.14326: done checking to see if all hosts have failed 30583 1726853787.14327: getting the remaining hosts for this loop 30583 1726853787.14329: done getting the remaining hosts for this loop 30583 1726853787.14332: getting the next task for host managed_node2 30583 1726853787.14340: done getting next task for host managed_node2 30583 1726853787.14344: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30583 1726853787.14348: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853787.14382: getting variables 30583 1726853787.14383: in VariableManager get_vars() 30583 1726853787.14435: Calling all_inventory to load vars for managed_node2 30583 1726853787.14438: Calling groups_inventory to load vars for managed_node2 30583 1726853787.14440: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853787.14449: Calling all_plugins_play to load vars for managed_node2 30583 1726853787.14451: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853787.14454: Calling groups_plugins_play to load vars for managed_node2 30583 1726853787.15281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853787.16149: done with get_vars() 30583 1726853787.16172: done getting variables 30583 1726853787.16216: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:36:27 -0400 (0:00:00.089) 0:02:02.499 ****** 30583 1726853787.16243: entering _queue_task() for managed_node2/package 30583 1726853787.16507: worker is 1 (out of 1 available) 30583 1726853787.16520: exiting _queue_task() for managed_node2/package 30583 1726853787.16534: done queuing things up, now waiting for results queue to drain 30583 1726853787.16535: waiting for pending results... 
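The "Install packages" task above was skipped because its conditional, `not network_packages is subset(ansible_facts.packages.keys())`, evaluated to False — every required package was already present in the gathered package facts. A minimal Python sketch of the set-containment check the Jinja2 `subset` test performs (illustrative only, not Ansible's actual implementation; the package data below is a hypothetical example, not taken from this run):

```python
# Illustrative sketch of the containment logic behind the conditional
#   not network_packages is subset(ansible_facts.packages.keys())
# The task runs only when some required package is NOT already installed.

def is_subset(needed, installed):
    """True when every needed package already appears among the installed ones."""
    return set(needed) <= set(installed)

# Hypothetical sample data (the real values come from the role defaults
# and the package_facts module).
network_packages = ["NetworkManager"]
installed_packages = {
    "NetworkManager": [{"version": "1.48.10"}],
    "openssh-server": [{"version": "9.3p1"}],
}

# Mirrors the log: the conditional is False, so the task is skipped.
run_task = not is_subset(network_packages, installed_packages.keys())
print(run_task)  # False -> task skipped
```

When the facts cache lacks even one required package, `is_subset` returns False, the negated conditional becomes True, and the package action would actually run.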
30583 1726853787.16737: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30583 1726853787.16849: in run() - task 02083763-bbaf-05ea-abc5-0000000024ad 30583 1726853787.16862: variable 'ansible_search_path' from source: unknown 30583 1726853787.16867: variable 'ansible_search_path' from source: unknown 30583 1726853787.16900: calling self._execute() 30583 1726853787.16986: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853787.16990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853787.16997: variable 'omit' from source: magic vars 30583 1726853787.17290: variable 'ansible_distribution_major_version' from source: facts 30583 1726853787.17300: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853787.17390: variable 'network_state' from source: role '' defaults 30583 1726853787.17399: Evaluated conditional (network_state != {}): False 30583 1726853787.17402: when evaluation is False, skipping this task 30583 1726853787.17404: _execute() done 30583 1726853787.17407: dumping result to json 30583 1726853787.17409: done dumping result, returning 30583 1726853787.17421: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-05ea-abc5-0000000024ad] 30583 1726853787.17424: sending task result for task 02083763-bbaf-05ea-abc5-0000000024ad 30583 1726853787.17514: done sending task result for task 02083763-bbaf-05ea-abc5-0000000024ad 30583 1726853787.17517: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853787.17565: no more pending results, returning what we have 30583 1726853787.17570: results queue empty 30583 1726853787.17575: checking 
for any_errors_fatal 30583 1726853787.17585: done checking for any_errors_fatal 30583 1726853787.17585: checking for max_fail_percentage 30583 1726853787.17587: done checking for max_fail_percentage 30583 1726853787.17588: checking to see if all hosts have failed and the running result is not ok 30583 1726853787.17589: done checking to see if all hosts have failed 30583 1726853787.17590: getting the remaining hosts for this loop 30583 1726853787.17591: done getting the remaining hosts for this loop 30583 1726853787.17595: getting the next task for host managed_node2 30583 1726853787.17603: done getting next task for host managed_node2 30583 1726853787.17607: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30583 1726853787.17612: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853787.17641: getting variables 30583 1726853787.17643: in VariableManager get_vars() 30583 1726853787.17691: Calling all_inventory to load vars for managed_node2 30583 1726853787.17694: Calling groups_inventory to load vars for managed_node2 30583 1726853787.17696: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853787.17705: Calling all_plugins_play to load vars for managed_node2 30583 1726853787.17708: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853787.17711: Calling groups_plugins_play to load vars for managed_node2 30583 1726853787.18638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853787.19489: done with get_vars() 30583 1726853787.19506: done getting variables 30583 1726853787.19550: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:36:27 -0400 (0:00:00.033) 0:02:02.533 ****** 30583 1726853787.19579: entering _queue_task() for managed_node2/package 30583 1726853787.19837: worker is 1 (out of 1 available) 30583 1726853787.19854: exiting _queue_task() for managed_node2/package 30583 1726853787.19868: done queuing things up, now waiting for results queue to drain 30583 1726853787.19870: waiting for pending results... 
30583 1726853787.20065: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30583 1726853787.20180: in run() - task 02083763-bbaf-05ea-abc5-0000000024ae 30583 1726853787.20191: variable 'ansible_search_path' from source: unknown 30583 1726853787.20194: variable 'ansible_search_path' from source: unknown 30583 1726853787.20224: calling self._execute() 30583 1726853787.20302: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853787.20309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853787.20318: variable 'omit' from source: magic vars 30583 1726853787.20612: variable 'ansible_distribution_major_version' from source: facts 30583 1726853787.20620: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853787.20715: variable 'network_state' from source: role '' defaults 30583 1726853787.20722: Evaluated conditional (network_state != {}): False 30583 1726853787.20725: when evaluation is False, skipping this task 30583 1726853787.20728: _execute() done 30583 1726853787.20730: dumping result to json 30583 1726853787.20733: done dumping result, returning 30583 1726853787.20741: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-05ea-abc5-0000000024ae] 30583 1726853787.20747: sending task result for task 02083763-bbaf-05ea-abc5-0000000024ae 30583 1726853787.20836: done sending task result for task 02083763-bbaf-05ea-abc5-0000000024ae 30583 1726853787.20840: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853787.20903: no more pending results, returning what we have 30583 1726853787.20907: results queue empty 30583 1726853787.20908: checking for 
any_errors_fatal 30583 1726853787.20918: done checking for any_errors_fatal 30583 1726853787.20918: checking for max_fail_percentage 30583 1726853787.20920: done checking for max_fail_percentage 30583 1726853787.20921: checking to see if all hosts have failed and the running result is not ok 30583 1726853787.20922: done checking to see if all hosts have failed 30583 1726853787.20922: getting the remaining hosts for this loop 30583 1726853787.20924: done getting the remaining hosts for this loop 30583 1726853787.20928: getting the next task for host managed_node2 30583 1726853787.20936: done getting next task for host managed_node2 30583 1726853787.20940: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30583 1726853787.20944: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853787.20970: getting variables 30583 1726853787.20974: in VariableManager get_vars() 30583 1726853787.21014: Calling all_inventory to load vars for managed_node2 30583 1726853787.21017: Calling groups_inventory to load vars for managed_node2 30583 1726853787.21019: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853787.21027: Calling all_plugins_play to load vars for managed_node2 30583 1726853787.21030: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853787.21032: Calling groups_plugins_play to load vars for managed_node2 30583 1726853787.21807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853787.22679: done with get_vars() 30583 1726853787.22696: done getting variables 30583 1726853787.22737: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:36:27 -0400 (0:00:00.031) 0:02:02.564 ****** 30583 1726853787.22764: entering _queue_task() for managed_node2/service 30583 1726853787.23005: worker is 1 (out of 1 available) 30583 1726853787.23018: exiting _queue_task() for managed_node2/service 30583 1726853787.23030: done queuing things up, now waiting for results queue to drain 30583 1726853787.23032: waiting for pending results... 
30583 1726853787.23221: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30583 1726853787.23330: in run() - task 02083763-bbaf-05ea-abc5-0000000024af 30583 1726853787.23340: variable 'ansible_search_path' from source: unknown 30583 1726853787.23344: variable 'ansible_search_path' from source: unknown 30583 1726853787.23379: calling self._execute() 30583 1726853787.23456: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853787.23460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853787.23474: variable 'omit' from source: magic vars 30583 1726853787.23755: variable 'ansible_distribution_major_version' from source: facts 30583 1726853787.23766: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853787.23854: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853787.23992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853787.25790: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853787.25833: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853787.25876: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853787.25902: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853787.25923: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853787.25984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30583 1726853787.26004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853787.26021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853787.26046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853787.26056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853787.26095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853787.26112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853787.26129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853787.26153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853787.26166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853787.26197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853787.26212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853787.26228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853787.26252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853787.26265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853787.26385: variable 'network_connections' from source: include params 30583 1726853787.26395: variable 'interface' from source: play vars 30583 1726853787.26447: variable 'interface' from source: play vars 30583 1726853787.26501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853787.26609: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853787.26639: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853787.26664: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853787.26695: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853787.26723: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853787.26740: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853787.26762: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853787.26783: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853787.26822: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853787.26987: variable 'network_connections' from source: include params 30583 1726853787.26990: variable 'interface' from source: play vars 30583 1726853787.27033: variable 'interface' from source: play vars 30583 1726853787.27051: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853787.27055: when evaluation is False, skipping this task 30583 1726853787.27057: _execute() done 30583 1726853787.27061: dumping result to json 30583 1726853787.27064: done dumping result, returning 30583 1726853787.27075: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-0000000024af] 30583 1726853787.27079: sending task result for task 02083763-bbaf-05ea-abc5-0000000024af 30583 1726853787.27170: done sending task result for task 
02083763-bbaf-05ea-abc5-0000000024af 30583 1726853787.27184: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853787.27231: no more pending results, returning what we have 30583 1726853787.27234: results queue empty 30583 1726853787.27235: checking for any_errors_fatal 30583 1726853787.27242: done checking for any_errors_fatal 30583 1726853787.27242: checking for max_fail_percentage 30583 1726853787.27244: done checking for max_fail_percentage 30583 1726853787.27245: checking to see if all hosts have failed and the running result is not ok 30583 1726853787.27246: done checking to see if all hosts have failed 30583 1726853787.27247: getting the remaining hosts for this loop 30583 1726853787.27249: done getting the remaining hosts for this loop 30583 1726853787.27252: getting the next task for host managed_node2 30583 1726853787.27260: done getting next task for host managed_node2 30583 1726853787.27264: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30583 1726853787.27268: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853787.27301: getting variables 30583 1726853787.27303: in VariableManager get_vars() 30583 1726853787.27349: Calling all_inventory to load vars for managed_node2 30583 1726853787.27352: Calling groups_inventory to load vars for managed_node2 30583 1726853787.27354: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853787.27363: Calling all_plugins_play to load vars for managed_node2 30583 1726853787.27366: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853787.27368: Calling groups_plugins_play to load vars for managed_node2 30583 1726853787.28325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853787.29189: done with get_vars() 30583 1726853787.29207: done getting variables 30583 1726853787.29251: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:36:27 -0400 (0:00:00.065) 0:02:02.630 ****** 30583 1726853787.29278: entering _queue_task() for managed_node2/service 30583 1726853787.29538: worker is 1 (out of 1 available) 30583 1726853787.29552: exiting _queue_task() for managed_node2/service 30583 1726853787.29566: done 
queuing things up, now waiting for results queue to drain 30583 1726853787.29568: waiting for pending results... 30583 1726853787.29765: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30583 1726853787.29878: in run() - task 02083763-bbaf-05ea-abc5-0000000024b0 30583 1726853787.29889: variable 'ansible_search_path' from source: unknown 30583 1726853787.29893: variable 'ansible_search_path' from source: unknown 30583 1726853787.29924: calling self._execute() 30583 1726853787.30009: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853787.30012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853787.30021: variable 'omit' from source: magic vars 30583 1726853787.30313: variable 'ansible_distribution_major_version' from source: facts 30583 1726853787.30323: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853787.30438: variable 'network_provider' from source: set_fact 30583 1726853787.30442: variable 'network_state' from source: role '' defaults 30583 1726853787.30452: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30583 1726853787.30456: variable 'omit' from source: magic vars 30583 1726853787.30504: variable 'omit' from source: magic vars 30583 1726853787.30522: variable 'network_service_name' from source: role '' defaults 30583 1726853787.30575: variable 'network_service_name' from source: role '' defaults 30583 1726853787.30643: variable '__network_provider_setup' from source: role '' defaults 30583 1726853787.30646: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853787.30694: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853787.30702: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853787.30744: variable '__network_packages_default_nm' from source: role '' 
defaults 30583 1726853787.30897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853787.32376: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853787.32432: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853787.32459: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853787.32489: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853787.32509: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853787.32568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853787.32590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853787.32608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853787.32639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853787.32650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853787.32685: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853787.32702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853787.32718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853787.32746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853787.32756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853787.32915: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853787.32994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853787.33011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853787.33027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853787.33051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853787.33066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853787.33132: variable 'ansible_python' from source: facts 30583 1726853787.33143: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853787.33203: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853787.33253: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853787.33337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853787.33354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853787.33375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853787.33402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853787.33412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853787.33444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853787.33466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853787.33483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853787.33510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853787.33521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853787.33612: variable 'network_connections' from source: include params 30583 1726853787.33619: variable 'interface' from source: play vars 30583 1726853787.33673: variable 'interface' from source: play vars 30583 1726853787.33745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853787.33884: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853787.33918: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853787.33952: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853787.33986: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853787.34028: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853787.34051: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853787.34077: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853787.34100: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853787.34137: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853787.34318: variable 'network_connections' from source: include params 30583 1726853787.34323: variable 'interface' from source: play vars 30583 1726853787.34379: variable 'interface' from source: play vars 30583 1726853787.34403: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853787.34455: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853787.34641: variable 'network_connections' from source: include params 30583 1726853787.34644: variable 'interface' from source: play vars 30583 1726853787.34698: variable 'interface' from source: play vars 30583 1726853787.34714: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853787.34769: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853787.34952: variable 'network_connections' from source: include params 30583 1726853787.34955: variable 'interface' from source: play vars 30583 1726853787.35009: variable 'interface' from source: play vars 30583 1726853787.35045: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30583 1726853787.35090: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853787.35095: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853787.35137: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853787.35273: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853787.35588: variable 'network_connections' from source: include params 30583 1726853787.35591: variable 'interface' from source: play vars 30583 1726853787.35632: variable 'interface' from source: play vars 30583 1726853787.35638: variable 'ansible_distribution' from source: facts 30583 1726853787.35641: variable '__network_rh_distros' from source: role '' defaults 30583 1726853787.35647: variable 'ansible_distribution_major_version' from source: facts 30583 1726853787.35657: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853787.35770: variable 'ansible_distribution' from source: facts 30583 1726853787.35775: variable '__network_rh_distros' from source: role '' defaults 30583 1726853787.35778: variable 'ansible_distribution_major_version' from source: facts 30583 1726853787.35792: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853787.35901: variable 'ansible_distribution' from source: facts 30583 1726853787.35904: variable '__network_rh_distros' from source: role '' defaults 30583 1726853787.35909: variable 'ansible_distribution_major_version' from source: facts 30583 1726853787.35933: variable 'network_provider' from source: set_fact 30583 1726853787.35951: variable 'omit' from source: magic vars 30583 1726853787.35977: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853787.35999: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853787.36014: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853787.36027: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853787.36036: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853787.36059: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853787.36065: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853787.36067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853787.36138: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853787.36144: Set connection var ansible_timeout to 10 30583 1726853787.36146: Set connection var ansible_connection to ssh 30583 1726853787.36151: Set connection var ansible_shell_executable to /bin/sh 30583 1726853787.36154: Set connection var ansible_shell_type to sh 30583 1726853787.36163: Set connection var ansible_pipelining to False 30583 1726853787.36185: variable 'ansible_shell_executable' from source: unknown 30583 1726853787.36187: variable 'ansible_connection' from source: unknown 30583 1726853787.36190: variable 'ansible_module_compression' from source: unknown 30583 1726853787.36192: variable 'ansible_shell_type' from source: unknown 30583 1726853787.36195: variable 'ansible_shell_executable' from source: unknown 30583 1726853787.36197: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853787.36199: variable 'ansible_pipelining' from source: unknown 30583 1726853787.36203: variable 'ansible_timeout' from source: unknown 30583 1726853787.36206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 
1726853787.36283: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853787.36292: variable 'omit' from source: magic vars 30583 1726853787.36297: starting attempt loop 30583 1726853787.36299: running the handler 30583 1726853787.36354: variable 'ansible_facts' from source: unknown 30583 1726853787.36933: _low_level_execute_command(): starting 30583 1726853787.36939: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853787.37431: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853787.37436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853787.37439: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853787.37441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853787.37495: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853787.37498: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853787.37501: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853787.37586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853787.39325: stdout chunk (state=3): >>>/root <<< 30583 1726853787.39428: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853787.39460: stderr chunk (state=3): >>><<< 30583 1726853787.39463: stdout chunk (state=3): >>><<< 30583 1726853787.39482: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853787.39492: _low_level_execute_command(): starting 30583 1726853787.39499: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir 
"` echo /root/.ansible/tmp/ansible-tmp-1726853787.3948233-35942-245803616385938 `" && echo ansible-tmp-1726853787.3948233-35942-245803616385938="` echo /root/.ansible/tmp/ansible-tmp-1726853787.3948233-35942-245803616385938 `" ) && sleep 0' 30583 1726853787.39936: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853787.39939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853787.39941: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853787.39943: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853787.39945: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853787.39997: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853787.40001: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853787.40005: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853787.40078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853787.42102: stdout chunk (state=3): 
>>>ansible-tmp-1726853787.3948233-35942-245803616385938=/root/.ansible/tmp/ansible-tmp-1726853787.3948233-35942-245803616385938 <<< 30583 1726853787.42212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853787.42237: stderr chunk (state=3): >>><<< 30583 1726853787.42240: stdout chunk (state=3): >>><<< 30583 1726853787.42254: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853787.3948233-35942-245803616385938=/root/.ansible/tmp/ansible-tmp-1726853787.3948233-35942-245803616385938 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853787.42285: variable 'ansible_module_compression' from source: unknown 30583 1726853787.42322: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30583 1726853787.42369: variable 'ansible_facts' 
from source: unknown 30583 1726853787.42503: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853787.3948233-35942-245803616385938/AnsiballZ_systemd.py 30583 1726853787.42599: Sending initial data 30583 1726853787.42602: Sent initial data (156 bytes) 30583 1726853787.43041: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853787.43046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853787.43053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853787.43056: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853787.43060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853787.43107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853787.43113: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853787.43115: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853787.43195: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853787.44842: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30583 1726853787.44846: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853787.44908: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853787.44986: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpki6wc9u2 /root/.ansible/tmp/ansible-tmp-1726853787.3948233-35942-245803616385938/AnsiballZ_systemd.py <<< 30583 1726853787.44990: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853787.3948233-35942-245803616385938/AnsiballZ_systemd.py" <<< 30583 1726853787.45053: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpki6wc9u2" to remote "/root/.ansible/tmp/ansible-tmp-1726853787.3948233-35942-245803616385938/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853787.3948233-35942-245803616385938/AnsiballZ_systemd.py" <<< 30583 1726853787.46255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853787.46293: stderr chunk (state=3): >>><<< 30583 1726853787.46298: stdout chunk (state=3): >>><<< 30583 1726853787.46333: done transferring module to remote 30583 1726853787.46341: _low_level_execute_command(): 
starting 30583 1726853787.46346: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853787.3948233-35942-245803616385938/ /root/.ansible/tmp/ansible-tmp-1726853787.3948233-35942-245803616385938/AnsiballZ_systemd.py && sleep 0' 30583 1726853787.46783: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853787.46786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853787.46789: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853787.46791: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853787.46793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853787.46795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853787.46841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853787.46844: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853787.46848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853787.46914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853787.48778: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 30583 1726853787.48801: stderr chunk (state=3): >>><<< 30583 1726853787.48806: stdout chunk (state=3): >>><<< 30583 1726853787.48820: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853787.48824: _low_level_execute_command(): starting 30583 1726853787.48827: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853787.3948233-35942-245803616385938/AnsiballZ_systemd.py && sleep 0' 30583 1726853787.49249: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853787.49253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853787.49255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853787.49260: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853787.49262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853787.49312: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853787.49318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853787.49320: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853787.49398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853787.78962: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", 
"MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4661248", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3295346688", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2036671000", "TasksCurrent": "4", 
"EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredum<<< 30583 1726853787.78969: stdout chunk (state=3): >>>pReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", 
"LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", 
"TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "syst<<< 30583 1726853787.79001: stdout chunk (state=3): >>>em.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", 
"ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30583 1726853787.81116: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853787.81144: stderr chunk (state=3): >>><<< 30583 1726853787.81148: stdout chunk (state=3): >>><<< 30583 1726853787.81166: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; 
argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4661248", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3295346688", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2036671000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", 
"MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid 
cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", 
"SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", 
"AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
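The stdout recovered by `_low_level_execute_command()` above is the JSON result of the `ansible.legacy.systemd` module, whose `status` dict mirrors `systemctl show` properties. A minimal sketch of pulling the health-relevant fields out of such a blob; the property names are copied from the log, but `unit_is_healthy` is a hypothetical helper, not part of Ansible:

```python
import json

# Hypothetical excerpt of the module stdout shown above; only a few of the
# many unit properties from the log are reproduced here.
module_stdout = json.dumps({
    "name": "NetworkManager",
    "changed": False,
    "enabled": True,
    "state": "started",
    "status": {
        "ActiveState": "active",
        "SubState": "running",
        "UnitFileState": "enabled",
        "MainPID": "6954",
        "Result": "success",
    },
})

def unit_is_healthy(raw: str) -> bool:
    """Check the properties the role cares about: unit active, enabled, OK."""
    status = json.loads(raw)["status"]
    return (
        status["ActiveState"] == "active"
        and status["UnitFileState"] == "enabled"
        and status["Result"] == "success"
    )

print(unit_is_healthy(module_stdout))  # True for the status shown above
```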
30583 1726853787.81290: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853787.3948233-35942-245803616385938/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853787.81306: _low_level_execute_command(): starting 30583 1726853787.81309: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853787.3948233-35942-245803616385938/ > /dev/null 2>&1 && sleep 0' 30583 1726853787.81749: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853787.81753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853787.81755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853787.81760: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853787.81762: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853787.81764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853787.81812: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853787.81816: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853787.81821: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853787.81895: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853787.83777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853787.83802: stderr chunk (state=3): >>><<< 30583 1726853787.83805: stdout chunk (state=3): >>><<< 30583 1726853787.83820: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853787.83826: handler run complete 30583 1726853787.83866: attempt loop complete, returning result 30583 1726853787.83869: _execute() done 30583 1726853787.83873: dumping result to json 30583 1726853787.83885: done dumping result, returning 30583 1726853787.83894: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-05ea-abc5-0000000024b0] 30583 1726853787.83899: sending task result for task 02083763-bbaf-05ea-abc5-0000000024b0 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853787.84218: no more pending results, returning what we have 30583 1726853787.84222: results queue empty 30583 1726853787.84223: checking for any_errors_fatal 30583 1726853787.84228: done checking for any_errors_fatal 30583 1726853787.84228: checking for max_fail_percentage 30583 1726853787.84230: done checking for max_fail_percentage 30583 1726853787.84231: checking to see if all hosts have failed and the running result is not ok 30583 1726853787.84231: done checking to see if all hosts have failed 30583 1726853787.84232: getting the remaining hosts for this loop 30583 1726853787.84234: done getting the remaining hosts for this loop 30583 1726853787.84238: getting the next task for host managed_node2 30583 1726853787.84245: done getting next task for host managed_node2 30583 1726853787.84248: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30583 1726853787.84252: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853787.84266: getting variables 30583 1726853787.84268: in VariableManager get_vars() 30583 1726853787.84307: Calling all_inventory to load vars for managed_node2 30583 1726853787.84310: Calling groups_inventory to load vars for managed_node2 30583 1726853787.84312: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853787.84321: Calling all_plugins_play to load vars for managed_node2 30583 1726853787.84324: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853787.84326: Calling groups_plugins_play to load vars for managed_node2 30583 1726853787.84890: done sending task result for task 02083763-bbaf-05ea-abc5-0000000024b0 30583 1726853787.84894: WORKER PROCESS EXITING 30583 1726853787.85253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853787.86130: done with get_vars() 30583 1726853787.86148: done getting variables 30583 1726853787.86195: Loading ActionModule 'service' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:36:27 -0400 (0:00:00.569) 0:02:03.199 ****** 30583 1726853787.86226: entering _queue_task() for managed_node2/service 30583 1726853787.86486: worker is 1 (out of 1 available) 30583 1726853787.86501: exiting _queue_task() for managed_node2/service 30583 1726853787.86514: done queuing things up, now waiting for results queue to drain 30583 1726853787.86515: waiting for pending results... 30583 1726853787.86708: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30583 1726853787.86813: in run() - task 02083763-bbaf-05ea-abc5-0000000024b1 30583 1726853787.86824: variable 'ansible_search_path' from source: unknown 30583 1726853787.86828: variable 'ansible_search_path' from source: unknown 30583 1726853787.86857: calling self._execute() 30583 1726853787.86931: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853787.86934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853787.86943: variable 'omit' from source: magic vars 30583 1726853787.87236: variable 'ansible_distribution_major_version' from source: facts 30583 1726853787.87245: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853787.87325: variable 'network_provider' from source: set_fact 30583 1726853787.87329: Evaluated conditional (network_provider == "nm"): True 30583 1726853787.87394: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 
1726853787.87456: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853787.87577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853787.89007: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853787.89055: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853787.89084: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853787.89110: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853787.89132: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853787.89200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853787.89220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853787.89240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853787.89268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853787.89281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30583 1726853787.89312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853787.89328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853787.89345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853787.89374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853787.89385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853787.89411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853787.89427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853787.89444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853787.89473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853787.89483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853787.89583: variable 'network_connections' from source: include params 30583 1726853787.89593: variable 'interface' from source: play vars 30583 1726853787.89639: variable 'interface' from source: play vars 30583 1726853787.89694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853787.89801: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853787.89827: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853787.89851: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853787.89875: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853787.89907: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853787.89922: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853787.89938: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853787.89955: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853787.89995: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853787.90149: variable 'network_connections' from source: include params 30583 1726853787.90153: variable 'interface' from source: play vars 30583 1726853787.90197: variable 'interface' from source: play vars 30583 1726853787.90223: Evaluated conditional (__network_wpa_supplicant_required): False 30583 1726853787.90226: when evaluation is False, skipping this task 30583 1726853787.90228: _execute() done 30583 1726853787.90231: dumping result to json 30583 1726853787.90233: done dumping result, returning 30583 1726853787.90237: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-05ea-abc5-0000000024b1] 30583 1726853787.90248: sending task result for task 02083763-bbaf-05ea-abc5-0000000024b1 30583 1726853787.90332: done sending task result for task 02083763-bbaf-05ea-abc5-0000000024b1 30583 1726853787.90335: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30583 1726853787.90385: no more pending results, returning what we have 30583 1726853787.90388: results queue empty 30583 1726853787.90389: checking for any_errors_fatal 30583 1726853787.90415: done checking for any_errors_fatal 30583 1726853787.90416: checking for max_fail_percentage 30583 1726853787.90418: done checking for max_fail_percentage 30583 1726853787.90419: checking to see if all hosts have failed and the running result is not ok 30583 1726853787.90419: done checking to see if all hosts have failed 30583 1726853787.90420: getting the remaining hosts for this loop 30583 1726853787.90422: done getting the remaining hosts for this loop 30583 1726853787.90425: getting the next task 
for host managed_node2 30583 1726853787.90434: done getting next task for host managed_node2 30583 1726853787.90438: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30583 1726853787.90442: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853787.90475: getting variables 30583 1726853787.90477: in VariableManager get_vars() 30583 1726853787.90521: Calling all_inventory to load vars for managed_node2 30583 1726853787.90523: Calling groups_inventory to load vars for managed_node2 30583 1726853787.90526: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853787.90535: Calling all_plugins_play to load vars for managed_node2 30583 1726853787.90538: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853787.90540: Calling groups_plugins_play to load vars for managed_node2 30583 1726853787.91334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853787.92207: done with get_vars() 30583 1726853787.92224: done getting variables 30583 1726853787.92267: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:36:27 -0400 (0:00:00.060) 0:02:03.260 ****** 30583 1726853787.92293: entering _queue_task() for managed_node2/service 30583 1726853787.92535: worker is 1 (out of 1 available) 30583 1726853787.92549: exiting _queue_task() for managed_node2/service 30583 1726853787.92564: done queuing things up, now waiting for results queue to drain 30583 1726853787.92565: waiting for pending results... 
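Every debug line carries a numeric prefix such as `30583 1726853787.92293`: the worker PID followed by a Unix epoch timestamp. A small sketch of decoding it; the fixed `-0400` offset is taken from the task banners in the log rather than derived from the host's timezone:

```python
from datetime import datetime, timezone, timedelta

prefix = "30583 1726853787.92293"
pid, epoch = prefix.split()

# Render the epoch in the -0400 offset used by the task banners above.
ts = datetime.fromtimestamp(float(epoch), tz=timezone(timedelta(hours=-4)))

print(pid)                                   # 30583
print(ts.strftime("%A %d %B %Y %H:%M:%S"))   # Friday 20 September 2024 13:36:27
```

This matches the banner time `Friday 20 September 2024 13:36:27 -0400` printed for the same task.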
30583 1726853787.92752: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30583 1726853787.92854: in run() - task 02083763-bbaf-05ea-abc5-0000000024b2 30583 1726853787.92867: variable 'ansible_search_path' from source: unknown 30583 1726853787.92872: variable 'ansible_search_path' from source: unknown 30583 1726853787.92905: calling self._execute() 30583 1726853787.92983: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853787.92986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853787.92995: variable 'omit' from source: magic vars 30583 1726853787.93282: variable 'ansible_distribution_major_version' from source: facts 30583 1726853787.93291: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853787.93375: variable 'network_provider' from source: set_fact 30583 1726853787.93379: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853787.93382: when evaluation is False, skipping this task 30583 1726853787.93385: _execute() done 30583 1726853787.93387: dumping result to json 30583 1726853787.93392: done dumping result, returning 30583 1726853787.93399: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-05ea-abc5-0000000024b2] 30583 1726853787.93404: sending task result for task 02083763-bbaf-05ea-abc5-0000000024b2 30583 1726853787.93490: done sending task result for task 02083763-bbaf-05ea-abc5-0000000024b2 30583 1726853787.93493: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853787.93534: no more pending results, returning what we have 30583 1726853787.93538: results queue empty 30583 1726853787.93539: checking for any_errors_fatal 30583 1726853787.93549: done checking for 
any_errors_fatal 30583 1726853787.93549: checking for max_fail_percentage 30583 1726853787.93551: done checking for max_fail_percentage 30583 1726853787.93552: checking to see if all hosts have failed and the running result is not ok 30583 1726853787.93553: done checking to see if all hosts have failed 30583 1726853787.93554: getting the remaining hosts for this loop 30583 1726853787.93555: done getting the remaining hosts for this loop 30583 1726853787.93560: getting the next task for host managed_node2 30583 1726853787.93567: done getting next task for host managed_node2 30583 1726853787.93573: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853787.93577: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853787.93604: getting variables 30583 1726853787.93606: in VariableManager get_vars() 30583 1726853787.93646: Calling all_inventory to load vars for managed_node2 30583 1726853787.93648: Calling groups_inventory to load vars for managed_node2 30583 1726853787.93650: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853787.93658: Calling all_plugins_play to load vars for managed_node2 30583 1726853787.93661: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853787.93663: Calling groups_plugins_play to load vars for managed_node2 30583 1726853787.94566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853787.95415: done with get_vars() 30583 1726853787.95432: done getting variables 30583 1726853787.95476: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:36:27 -0400 (0:00:00.032) 0:02:03.292 ****** 30583 1726853787.95503: entering _queue_task() for managed_node2/copy 30583 1726853787.95750: worker is 1 (out of 1 available) 30583 1726853787.95763: exiting _queue_task() for managed_node2/copy 30583 1726853787.95777: done queuing things up, now waiting for results queue to drain 30583 1726853787.95779: waiting for pending results... 
30583 1726853787.95975: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853787.96073: in run() - task 02083763-bbaf-05ea-abc5-0000000024b3 30583 1726853787.96084: variable 'ansible_search_path' from source: unknown 30583 1726853787.96087: variable 'ansible_search_path' from source: unknown 30583 1726853787.96120: calling self._execute() 30583 1726853787.96196: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853787.96200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853787.96210: variable 'omit' from source: magic vars 30583 1726853787.96495: variable 'ansible_distribution_major_version' from source: facts 30583 1726853787.96504: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853787.96588: variable 'network_provider' from source: set_fact 30583 1726853787.96592: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853787.96595: when evaluation is False, skipping this task 30583 1726853787.96598: _execute() done 30583 1726853787.96601: dumping result to json 30583 1726853787.96605: done dumping result, returning 30583 1726853787.96613: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-05ea-abc5-0000000024b3] 30583 1726853787.96616: sending task result for task 02083763-bbaf-05ea-abc5-0000000024b3 30583 1726853787.96707: done sending task result for task 02083763-bbaf-05ea-abc5-0000000024b3 30583 1726853787.96710: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30583 1726853787.96756: no more pending results, returning what we have 30583 1726853787.96760: results queue empty 30583 1726853787.96761: checking for 
any_errors_fatal 30583 1726853787.96769: done checking for any_errors_fatal 30583 1726853787.96770: checking for max_fail_percentage 30583 1726853787.96774: done checking for max_fail_percentage 30583 1726853787.96775: checking to see if all hosts have failed and the running result is not ok 30583 1726853787.96776: done checking to see if all hosts have failed 30583 1726853787.96776: getting the remaining hosts for this loop 30583 1726853787.96778: done getting the remaining hosts for this loop 30583 1726853787.96782: getting the next task for host managed_node2 30583 1726853787.96791: done getting next task for host managed_node2 30583 1726853787.96794: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853787.96799: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853787.96829: getting variables 30583 1726853787.96831: in VariableManager get_vars() 30583 1726853787.96876: Calling all_inventory to load vars for managed_node2 30583 1726853787.96879: Calling groups_inventory to load vars for managed_node2 30583 1726853787.96882: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853787.96891: Calling all_plugins_play to load vars for managed_node2 30583 1726853787.96894: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853787.96896: Calling groups_plugins_play to load vars for managed_node2 30583 1726853787.97686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853787.98554: done with get_vars() 30583 1726853787.98573: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:36:27 -0400 (0:00:00.031) 0:02:03.323 ****** 30583 1726853787.98636: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853787.98894: worker is 1 (out of 1 available) 30583 1726853787.98907: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853787.98921: done queuing things up, now waiting for results queue to drain 30583 1726853787.98922: waiting for pending results... 
30583 1726853787.99120: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853787.99214: in run() - task 02083763-bbaf-05ea-abc5-0000000024b4 30583 1726853787.99225: variable 'ansible_search_path' from source: unknown 30583 1726853787.99229: variable 'ansible_search_path' from source: unknown 30583 1726853787.99260: calling self._execute() 30583 1726853787.99341: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853787.99345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853787.99354: variable 'omit' from source: magic vars 30583 1726853787.99645: variable 'ansible_distribution_major_version' from source: facts 30583 1726853787.99654: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853787.99662: variable 'omit' from source: magic vars 30583 1726853787.99712: variable 'omit' from source: magic vars 30583 1726853787.99825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853788.01495: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853788.01541: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853788.01572: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853788.01598: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853788.01618: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853788.01682: variable 'network_provider' from source: set_fact 30583 1726853788.01775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853788.01796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853788.01814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853788.01840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853788.01851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853788.01910: variable 'omit' from source: magic vars 30583 1726853788.01988: variable 'omit' from source: magic vars 30583 1726853788.02051: variable 'network_connections' from source: include params 30583 1726853788.02063: variable 'interface' from source: play vars 30583 1726853788.02110: variable 'interface' from source: play vars 30583 1726853788.02211: variable 'omit' from source: magic vars 30583 1726853788.02218: variable '__lsr_ansible_managed' from source: task vars 30583 1726853788.02263: variable '__lsr_ansible_managed' from source: task vars 30583 1726853788.02398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30583 1726853788.02538: Loaded config def from plugin (lookup/template) 30583 1726853788.02541: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30583 1726853788.02642: File lookup term: get_ansible_managed.j2 30583 1726853788.02645: variable 
'ansible_search_path' from source: unknown 30583 1726853788.02648: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30583 1726853788.02653: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30583 1726853788.02656: variable 'ansible_search_path' from source: unknown 30583 1726853788.05862: variable 'ansible_managed' from source: unknown 30583 1726853788.05945: variable 'omit' from source: magic vars 30583 1726853788.05967: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853788.05992: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853788.06008: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853788.06021: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30583 1726853788.06029: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853788.06052: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853788.06055: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853788.06061: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853788.06126: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853788.06131: Set connection var ansible_timeout to 10 30583 1726853788.06134: Set connection var ansible_connection to ssh 30583 1726853788.06138: Set connection var ansible_shell_executable to /bin/sh 30583 1726853788.06141: Set connection var ansible_shell_type to sh 30583 1726853788.06148: Set connection var ansible_pipelining to False 30583 1726853788.06168: variable 'ansible_shell_executable' from source: unknown 30583 1726853788.06172: variable 'ansible_connection' from source: unknown 30583 1726853788.06175: variable 'ansible_module_compression' from source: unknown 30583 1726853788.06177: variable 'ansible_shell_type' from source: unknown 30583 1726853788.06179: variable 'ansible_shell_executable' from source: unknown 30583 1726853788.06183: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853788.06185: variable 'ansible_pipelining' from source: unknown 30583 1726853788.06187: variable 'ansible_timeout' from source: unknown 30583 1726853788.06191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853788.06284: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853788.06295: variable 'omit' from 
source: magic vars 30583 1726853788.06298: starting attempt loop 30583 1726853788.06300: running the handler 30583 1726853788.06314: _low_level_execute_command(): starting 30583 1726853788.06320: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853788.06820: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853788.06823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853788.06826: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853788.06828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853788.06877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853788.06882: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853788.06894: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853788.06980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853788.08708: stdout chunk (state=3): >>>/root <<< 30583 1726853788.08803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 
1726853788.08833: stderr chunk (state=3): >>><<< 30583 1726853788.08837: stdout chunk (state=3): >>><<< 30583 1726853788.08855: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853788.08868: _low_level_execute_command(): starting 30583 1726853788.08876: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853788.088561-35956-148335245694100 `" && echo ansible-tmp-1726853788.088561-35956-148335245694100="` echo /root/.ansible/tmp/ansible-tmp-1726853788.088561-35956-148335245694100 `" ) && sleep 0' 30583 1726853788.09327: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853788.09330: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853788.09332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853788.09334: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853788.09336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853788.09387: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853788.09390: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853788.09468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853788.11463: stdout chunk (state=3): >>>ansible-tmp-1726853788.088561-35956-148335245694100=/root/.ansible/tmp/ansible-tmp-1726853788.088561-35956-148335245694100 <<< 30583 1726853788.11564: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853788.11599: stderr chunk (state=3): >>><<< 30583 1726853788.11602: stdout chunk (state=3): >>><<< 30583 1726853788.11619: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853788.088561-35956-148335245694100=/root/.ansible/tmp/ansible-tmp-1726853788.088561-35956-148335245694100 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
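The `( umask 77 && mkdir ... )` exchange above can be reproduced locally as a standalone sketch. This is an illustration of the pattern the log shows, not Ansible's actual code path; the base path and naming scheme here are stand-ins for the real `~/.ansible/tmp/ansible-tmp-<ts>-<pid>-<rand>` directories.

```shell
#!/bin/sh
# Sketch of the remote tmp-dir creation Ansible performs over SSH.
# umask 77 makes the directories mode 0700 (owner-only), so module
# payloads staged there are not readable by other users.
base="${TMPDIR:-/tmp}/ansible-sketch"      # stand-in for ~/.ansible/tmp
stamp="ansible-tmp-$(date +%s)-$$"         # stand-in for the timestamp-pid-random name
( umask 77 && mkdir -p "$base" && mkdir "$base/$stamp" \
  && echo "$stamp=$base/$stamp" )          # echoed back so the controller learns the path
```

The trailing `echo name=path` is why the log's stdout chunk contains `ansible-tmp-...=/root/.ansible/tmp/ansible-tmp-...`: the controller parses that line to discover the directory it just created.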
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853788.11663: variable 'ansible_module_compression' from source: unknown 30583 1726853788.11705: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30583 1726853788.11730: variable 'ansible_facts' from source: unknown 30583 1726853788.11798: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853788.088561-35956-148335245694100/AnsiballZ_network_connections.py 30583 1726853788.11901: Sending initial data 30583 1726853788.11905: Sent initial data (167 bytes) 30583 1726853788.12353: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853788.12356: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853788.12366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853788.12369: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853788.12377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853788.12418: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853788.12424: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853788.12427: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853788.12499: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853788.14146: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" 
revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30583 1726853788.14150: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853788.14211: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853788.14282: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpfpocxa89 /root/.ansible/tmp/ansible-tmp-1726853788.088561-35956-148335245694100/AnsiballZ_network_connections.py <<< 30583 1726853788.14287: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853788.088561-35956-148335245694100/AnsiballZ_network_connections.py" <<< 30583 1726853788.14356: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpfpocxa89" to remote "/root/.ansible/tmp/ansible-tmp-1726853788.088561-35956-148335245694100/AnsiballZ_network_connections.py" <<< 30583 1726853788.14361: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853788.088561-35956-148335245694100/AnsiballZ_network_connections.py" <<< 30583 1726853788.15212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853788.15253: stderr chunk (state=3): >>><<< 30583 1726853788.15256: stdout chunk (state=3): >>><<< 30583 1726853788.15285: done transferring module to remote 30583 1726853788.15294: _low_level_execute_command(): starting 30583 1726853788.15299: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853788.088561-35956-148335245694100/ /root/.ansible/tmp/ansible-tmp-1726853788.088561-35956-148335245694100/AnsiballZ_network_connections.py && sleep 0' 30583 1726853788.15745: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853788.15748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853788.15750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853788.15752: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853788.15754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853788.15810: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853788.15815: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853788.15818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853788.15882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853788.17741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853788.17769: stderr chunk (state=3): >>><<< 30583 1726853788.17775: stdout chunk (state=3): >>><<< 30583 1726853788.17787: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853788.17790: _low_level_execute_command(): starting 30583 1726853788.17795: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853788.088561-35956-148335245694100/AnsiballZ_network_connections.py && sleep 0' 30583 1726853788.18229: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853788.18232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853788.18236: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853788.18238: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853788.18240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853788.18295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853788.18301: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853788.18304: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853788.18384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853788.52118: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_wcuus01d/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_wcuus01d/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/3512d7ba-d156-408a-9044-dcd593676efd: error=unknown <<< 30583 1726853788.52294: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible 
managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}}<<< 30583 1726853788.52481: stdout chunk (state=3): >>> <<< 30583 1726853788.54264: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853788.54297: stderr chunk (state=3): >>><<< 30583 1726853788.54300: stdout chunk (state=3): >>><<< 30583 1726853788.54316: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_wcuus01d/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_wcuus01d/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/3512d7ba-d156-408a-9044-dcd593676efd: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": 
"absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853788.54346: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853788.088561-35956-148335245694100/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853788.54354: _low_level_execute_command(): starting 30583 1726853788.54358: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853788.088561-35956-148335245694100/ > /dev/null 2>&1 && sleep 0' 30583 1726853788.54808: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853788.54812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853788.54814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853788.54816: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853788.54818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853788.54870: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853788.54878: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853788.54944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853788.56891: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853788.56915: stderr chunk (state=3): >>><<< 30583 1726853788.56918: stdout chunk (state=3): >>><<< 30583 1726853788.56930: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853788.56939: handler run complete 30583 1726853788.56961: attempt loop complete, returning result 30583 1726853788.56964: _execute() done 30583 1726853788.56966: dumping result to json 30583 1726853788.56968: done dumping result, returning 30583 1726853788.56978: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-05ea-abc5-0000000024b4] 30583 1726853788.56982: sending task result for task 02083763-bbaf-05ea-abc5-0000000024b4 30583 1726853788.57083: done sending task result for task 02083763-bbaf-05ea-abc5-0000000024b4 30583 1726853788.57086: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 30583 1726853788.57187: no more pending results, returning what we have 30583 1726853788.57190: results queue empty 30583 1726853788.57192: checking for any_errors_fatal 30583 1726853788.57202: done checking for any_errors_fatal 30583 1726853788.57203: checking for max_fail_percentage 30583 1726853788.57205: done checking for max_fail_percentage 30583 1726853788.57206: checking to see if all hosts have failed and the running result is not ok 30583 1726853788.57207: done checking to see if all hosts have failed 30583 1726853788.57207: getting the remaining hosts for this loop 30583 1726853788.57209: done getting the remaining hosts for this loop 30583 1726853788.57212: getting the next task for host managed_node2 30583 
1726853788.57219: done getting next task for host managed_node2 30583 1726853788.57222: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853788.57226: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853788.57239: getting variables 30583 1726853788.57240: in VariableManager get_vars() 30583 1726853788.57286: Calling all_inventory to load vars for managed_node2 30583 1726853788.57288: Calling groups_inventory to load vars for managed_node2 30583 1726853788.57291: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853788.57299: Calling all_plugins_play to load vars for managed_node2 30583 1726853788.57302: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853788.57309: Calling groups_plugins_play to load vars for managed_node2 30583 1726853788.58241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853788.59110: done with get_vars() 30583 1726853788.59126: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:36:28 -0400 (0:00:00.605) 0:02:03.929 ****** 30583 1726853788.59193: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30583 1726853788.59441: worker is 1 (out of 1 available) 30583 1726853788.59454: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30583 1726853788.59468: done queuing things up, now waiting for results queue to drain 30583 1726853788.59470: waiting for pending results... 
30583 1726853788.59661: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853788.59762: in run() - task 02083763-bbaf-05ea-abc5-0000000024b5 30583 1726853788.59773: variable 'ansible_search_path' from source: unknown 30583 1726853788.59777: variable 'ansible_search_path' from source: unknown 30583 1726853788.59808: calling self._execute() 30583 1726853788.59887: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853788.59892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853788.59900: variable 'omit' from source: magic vars 30583 1726853788.60187: variable 'ansible_distribution_major_version' from source: facts 30583 1726853788.60196: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853788.60282: variable 'network_state' from source: role '' defaults 30583 1726853788.60290: Evaluated conditional (network_state != {}): False 30583 1726853788.60294: when evaluation is False, skipping this task 30583 1726853788.60296: _execute() done 30583 1726853788.60299: dumping result to json 30583 1726853788.60301: done dumping result, returning 30583 1726853788.60308: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-05ea-abc5-0000000024b5] 30583 1726853788.60312: sending task result for task 02083763-bbaf-05ea-abc5-0000000024b5 30583 1726853788.60400: done sending task result for task 02083763-bbaf-05ea-abc5-0000000024b5 30583 1726853788.60403: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853788.60455: no more pending results, returning what we have 30583 1726853788.60462: results queue empty 30583 1726853788.60463: checking for any_errors_fatal 30583 1726853788.60478: done checking for any_errors_fatal 
30583 1726853788.60479: checking for max_fail_percentage 30583 1726853788.60480: done checking for max_fail_percentage 30583 1726853788.60481: checking to see if all hosts have failed and the running result is not ok 30583 1726853788.60482: done checking to see if all hosts have failed 30583 1726853788.60483: getting the remaining hosts for this loop 30583 1726853788.60485: done getting the remaining hosts for this loop 30583 1726853788.60488: getting the next task for host managed_node2 30583 1726853788.60495: done getting next task for host managed_node2 30583 1726853788.60499: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30583 1726853788.60503: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853788.60529: getting variables 30583 1726853788.60531: in VariableManager get_vars() 30583 1726853788.60569: Calling all_inventory to load vars for managed_node2 30583 1726853788.60576: Calling groups_inventory to load vars for managed_node2 30583 1726853788.60579: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853788.60587: Calling all_plugins_play to load vars for managed_node2 30583 1726853788.60589: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853788.60592: Calling groups_plugins_play to load vars for managed_node2 30583 1726853788.61363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853788.62340: done with get_vars() 30583 1726853788.62355: done getting variables 30583 1726853788.62402: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:36:28 -0400 (0:00:00.032) 0:02:03.961 ****** 30583 1726853788.62427: entering _queue_task() for managed_node2/debug 30583 1726853788.62662: worker is 1 (out of 1 available) 30583 1726853788.62680: exiting _queue_task() for managed_node2/debug 30583 1726853788.62693: done queuing things up, now waiting for results queue to drain 30583 1726853788.62694: waiting for pending results... 
30583 1726853788.62880: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30583 1726853788.62969: in run() - task 02083763-bbaf-05ea-abc5-0000000024b6 30583 1726853788.62983: variable 'ansible_search_path' from source: unknown 30583 1726853788.62987: variable 'ansible_search_path' from source: unknown 30583 1726853788.63015: calling self._execute() 30583 1726853788.63095: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853788.63099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853788.63106: variable 'omit' from source: magic vars 30583 1726853788.63386: variable 'ansible_distribution_major_version' from source: facts 30583 1726853788.63395: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853788.63401: variable 'omit' from source: magic vars 30583 1726853788.63442: variable 'omit' from source: magic vars 30583 1726853788.63469: variable 'omit' from source: magic vars 30583 1726853788.63505: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853788.63533: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853788.63549: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853788.63563: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853788.63578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853788.63601: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853788.63604: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853788.63607: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 30583 1726853788.63678: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853788.63688: Set connection var ansible_timeout to 10 30583 1726853788.63691: Set connection var ansible_connection to ssh 30583 1726853788.63693: Set connection var ansible_shell_executable to /bin/sh 30583 1726853788.63695: Set connection var ansible_shell_type to sh 30583 1726853788.63702: Set connection var ansible_pipelining to False 30583 1726853788.63719: variable 'ansible_shell_executable' from source: unknown 30583 1726853788.63722: variable 'ansible_connection' from source: unknown 30583 1726853788.63725: variable 'ansible_module_compression' from source: unknown 30583 1726853788.63727: variable 'ansible_shell_type' from source: unknown 30583 1726853788.63729: variable 'ansible_shell_executable' from source: unknown 30583 1726853788.63732: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853788.63736: variable 'ansible_pipelining' from source: unknown 30583 1726853788.63738: variable 'ansible_timeout' from source: unknown 30583 1726853788.63742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853788.63845: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853788.63853: variable 'omit' from source: magic vars 30583 1726853788.63861: starting attempt loop 30583 1726853788.63864: running the handler 30583 1726853788.63963: variable '__network_connections_result' from source: set_fact 30583 1726853788.64003: handler run complete 30583 1726853788.64019: attempt loop complete, returning result 30583 1726853788.64022: _execute() done 30583 1726853788.64025: dumping result to json 30583 1726853788.64027: 
done dumping result, returning 30583 1726853788.64034: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-05ea-abc5-0000000024b6] 30583 1726853788.64038: sending task result for task 02083763-bbaf-05ea-abc5-0000000024b6 30583 1726853788.64125: done sending task result for task 02083763-bbaf-05ea-abc5-0000000024b6 30583 1726853788.64128: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 30583 1726853788.64204: no more pending results, returning what we have 30583 1726853788.64207: results queue empty 30583 1726853788.64208: checking for any_errors_fatal 30583 1726853788.64214: done checking for any_errors_fatal 30583 1726853788.64215: checking for max_fail_percentage 30583 1726853788.64216: done checking for max_fail_percentage 30583 1726853788.64217: checking to see if all hosts have failed and the running result is not ok 30583 1726853788.64218: done checking to see if all hosts have failed 30583 1726853788.64219: getting the remaining hosts for this loop 30583 1726853788.64221: done getting the remaining hosts for this loop 30583 1726853788.64224: getting the next task for host managed_node2 30583 1726853788.64232: done getting next task for host managed_node2 30583 1726853788.64236: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30583 1726853788.64240: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853788.64252: getting variables 30583 1726853788.64253: in VariableManager get_vars() 30583 1726853788.64297: Calling all_inventory to load vars for managed_node2 30583 1726853788.64299: Calling groups_inventory to load vars for managed_node2 30583 1726853788.64301: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853788.64310: Calling all_plugins_play to load vars for managed_node2 30583 1726853788.64312: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853788.64314: Calling groups_plugins_play to load vars for managed_node2 30583 1726853788.65094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853788.65956: done with get_vars() 30583 1726853788.65976: done getting variables 30583 1726853788.66018: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:36:28 -0400 (0:00:00.036) 0:02:03.997 ****** 30583 1726853788.66046: entering _queue_task() for managed_node2/debug 30583 1726853788.66285: worker is 1 (out of 1 available) 30583 1726853788.66300: exiting _queue_task() for managed_node2/debug 30583 1726853788.66313: done queuing things up, now waiting for results queue to drain 30583 1726853788.66314: waiting for pending results... 30583 1726853788.66504: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30583 1726853788.66596: in run() - task 02083763-bbaf-05ea-abc5-0000000024b7 30583 1726853788.66608: variable 'ansible_search_path' from source: unknown 30583 1726853788.66612: variable 'ansible_search_path' from source: unknown 30583 1726853788.66640: calling self._execute() 30583 1726853788.66721: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853788.66725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853788.66733: variable 'omit' from source: magic vars 30583 1726853788.67018: variable 'ansible_distribution_major_version' from source: facts 30583 1726853788.67028: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853788.67033: variable 'omit' from source: magic vars 30583 1726853788.67081: variable 'omit' from source: magic vars 30583 1726853788.67107: variable 'omit' from source: magic vars 30583 1726853788.67139: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853788.67166: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853788.67184: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853788.67199: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853788.67209: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853788.67233: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853788.67236: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853788.67239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853788.67377: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853788.67380: Set connection var ansible_timeout to 10 30583 1726853788.67382: Set connection var ansible_connection to ssh 30583 1726853788.67384: Set connection var ansible_shell_executable to /bin/sh 30583 1726853788.67387: Set connection var ansible_shell_type to sh 30583 1726853788.67389: Set connection var ansible_pipelining to False 30583 1726853788.67391: variable 'ansible_shell_executable' from source: unknown 30583 1726853788.67393: variable 'ansible_connection' from source: unknown 30583 1726853788.67396: variable 'ansible_module_compression' from source: unknown 30583 1726853788.67398: variable 'ansible_shell_type' from source: unknown 30583 1726853788.67400: variable 'ansible_shell_executable' from source: unknown 30583 1726853788.67401: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853788.67403: variable 'ansible_pipelining' from source: unknown 30583 1726853788.67405: variable 'ansible_timeout' from source: unknown 30583 1726853788.67410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853788.67461: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853788.67469: variable 'omit' from source: magic vars 30583 1726853788.67476: starting attempt loop 30583 1726853788.67479: running the handler 30583 1726853788.67516: variable '__network_connections_result' from source: set_fact 30583 1726853788.67573: variable '__network_connections_result' from source: set_fact 30583 1726853788.67648: handler run complete 30583 1726853788.67667: attempt loop complete, returning result 30583 1726853788.67670: _execute() done 30583 1726853788.67674: dumping result to json 30583 1726853788.67677: done dumping result, returning 30583 1726853788.67684: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-05ea-abc5-0000000024b7] 30583 1726853788.67687: sending task result for task 02083763-bbaf-05ea-abc5-0000000024b7 30583 1726853788.67775: done sending task result for task 02083763-bbaf-05ea-abc5-0000000024b7 30583 1726853788.67778: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 30583 1726853788.67869: no more pending results, returning what we have 30583 1726853788.67874: results queue empty 30583 1726853788.67875: checking for any_errors_fatal 30583 1726853788.67880: done checking for any_errors_fatal 30583 1726853788.67881: checking for max_fail_percentage 30583 1726853788.67882: done checking for max_fail_percentage 30583 1726853788.67883: checking to see if 
all hosts have failed and the running result is not ok 30583 1726853788.67884: done checking to see if all hosts have failed 30583 1726853788.67885: getting the remaining hosts for this loop 30583 1726853788.67888: done getting the remaining hosts for this loop 30583 1726853788.67891: getting the next task for host managed_node2 30583 1726853788.67898: done getting next task for host managed_node2 30583 1726853788.67901: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30583 1726853788.67905: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853788.67916: getting variables 30583 1726853788.67917: in VariableManager get_vars() 30583 1726853788.67952: Calling all_inventory to load vars for managed_node2 30583 1726853788.67955: Calling groups_inventory to load vars for managed_node2 30583 1726853788.67957: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853788.67967: Calling all_plugins_play to load vars for managed_node2 30583 1726853788.67969: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853788.67983: Calling groups_plugins_play to load vars for managed_node2 30583 1726853788.68885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853788.69732: done with get_vars() 30583 1726853788.69747: done getting variables 30583 1726853788.69792: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:36:28 -0400 (0:00:00.037) 0:02:04.035 ****** 30583 1726853788.69817: entering _queue_task() for managed_node2/debug 30583 1726853788.70051: worker is 1 (out of 1 available) 30583 1726853788.70068: exiting _queue_task() for managed_node2/debug 30583 1726853788.70082: done queuing things up, now waiting for results queue to drain 30583 1726853788.70083: waiting for pending results... 
30583 1726853788.70272: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30583 1726853788.70369: in run() - task 02083763-bbaf-05ea-abc5-0000000024b8 30583 1726853788.70384: variable 'ansible_search_path' from source: unknown 30583 1726853788.70388: variable 'ansible_search_path' from source: unknown 30583 1726853788.70416: calling self._execute() 30583 1726853788.70496: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853788.70500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853788.70509: variable 'omit' from source: magic vars 30583 1726853788.70800: variable 'ansible_distribution_major_version' from source: facts 30583 1726853788.70809: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853788.70895: variable 'network_state' from source: role '' defaults 30583 1726853788.70902: Evaluated conditional (network_state != {}): False 30583 1726853788.70906: when evaluation is False, skipping this task 30583 1726853788.70908: _execute() done 30583 1726853788.70911: dumping result to json 30583 1726853788.70913: done dumping result, returning 30583 1726853788.70921: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-05ea-abc5-0000000024b8] 30583 1726853788.70926: sending task result for task 02083763-bbaf-05ea-abc5-0000000024b8 30583 1726853788.71017: done sending task result for task 02083763-bbaf-05ea-abc5-0000000024b8 30583 1726853788.71020: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 30583 1726853788.71107: no more pending results, returning what we have 30583 1726853788.71111: results queue empty 30583 1726853788.71111: checking for any_errors_fatal 30583 1726853788.71119: done checking for any_errors_fatal 30583 1726853788.71119: checking for 
max_fail_percentage 30583 1726853788.71121: done checking for max_fail_percentage 30583 1726853788.71122: checking to see if all hosts have failed and the running result is not ok 30583 1726853788.71122: done checking to see if all hosts have failed 30583 1726853788.71123: getting the remaining hosts for this loop 30583 1726853788.71125: done getting the remaining hosts for this loop 30583 1726853788.71128: getting the next task for host managed_node2 30583 1726853788.71135: done getting next task for host managed_node2 30583 1726853788.71139: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30583 1726853788.71143: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853788.71169: getting variables 30583 1726853788.71172: in VariableManager get_vars() 30583 1726853788.71208: Calling all_inventory to load vars for managed_node2 30583 1726853788.71210: Calling groups_inventory to load vars for managed_node2 30583 1726853788.71212: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853788.71220: Calling all_plugins_play to load vars for managed_node2 30583 1726853788.71222: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853788.71224: Calling groups_plugins_play to load vars for managed_node2 30583 1726853788.71986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853788.72852: done with get_vars() 30583 1726853788.72872: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:36:28 -0400 (0:00:00.031) 0:02:04.066 ****** 30583 1726853788.72941: entering _queue_task() for managed_node2/ping 30583 1726853788.73188: worker is 1 (out of 1 available) 30583 1726853788.73202: exiting _queue_task() for managed_node2/ping 30583 1726853788.73214: done queuing things up, now waiting for results queue to drain 30583 1726853788.73215: waiting for pending results... 
30583 1726853788.73414: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 30583 1726853788.73506: in run() - task 02083763-bbaf-05ea-abc5-0000000024b9 30583 1726853788.73518: variable 'ansible_search_path' from source: unknown 30583 1726853788.73522: variable 'ansible_search_path' from source: unknown 30583 1726853788.73550: calling self._execute() 30583 1726853788.73638: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853788.73642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853788.73651: variable 'omit' from source: magic vars 30583 1726853788.73938: variable 'ansible_distribution_major_version' from source: facts 30583 1726853788.73948: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853788.73953: variable 'omit' from source: magic vars 30583 1726853788.74001: variable 'omit' from source: magic vars 30583 1726853788.74024: variable 'omit' from source: magic vars 30583 1726853788.74057: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853788.74086: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853788.74104: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853788.74118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853788.74128: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853788.74151: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853788.74154: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853788.74157: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30583 1726853788.74230: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853788.74236: Set connection var ansible_timeout to 10 30583 1726853788.74239: Set connection var ansible_connection to ssh 30583 1726853788.74243: Set connection var ansible_shell_executable to /bin/sh 30583 1726853788.74246: Set connection var ansible_shell_type to sh 30583 1726853788.74253: Set connection var ansible_pipelining to False 30583 1726853788.74274: variable 'ansible_shell_executable' from source: unknown 30583 1726853788.74277: variable 'ansible_connection' from source: unknown 30583 1726853788.74280: variable 'ansible_module_compression' from source: unknown 30583 1726853788.74282: variable 'ansible_shell_type' from source: unknown 30583 1726853788.74284: variable 'ansible_shell_executable' from source: unknown 30583 1726853788.74286: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853788.74288: variable 'ansible_pipelining' from source: unknown 30583 1726853788.74292: variable 'ansible_timeout' from source: unknown 30583 1726853788.74296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853788.74440: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853788.74449: variable 'omit' from source: magic vars 30583 1726853788.74454: starting attempt loop 30583 1726853788.74457: running the handler 30583 1726853788.74469: _low_level_execute_command(): starting 30583 1726853788.74477: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853788.74988: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 
1726853788.74992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853788.74995: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853788.74997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853788.75049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853788.75052: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853788.75054: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853788.75135: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853788.76882: stdout chunk (state=3): >>>/root <<< 30583 1726853788.77166: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853788.77169: stdout chunk (state=3): >>><<< 30583 1726853788.77174: stderr chunk (state=3): >>><<< 30583 1726853788.77199: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 
10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853788.77295: _low_level_execute_command(): starting 30583 1726853788.77299: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853788.7720642-35968-28852616904088 `" && echo ansible-tmp-1726853788.7720642-35968-28852616904088="` echo /root/.ansible/tmp/ansible-tmp-1726853788.7720642-35968-28852616904088 `" ) && sleep 0' 30583 1726853788.77845: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853788.77859: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853788.77887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853788.77964: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853788.78014: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853788.78032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853788.78082: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853788.78158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853788.80149: stdout chunk (state=3): >>>ansible-tmp-1726853788.7720642-35968-28852616904088=/root/.ansible/tmp/ansible-tmp-1726853788.7720642-35968-28852616904088 <<< 30583 1726853788.80256: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853788.80282: stderr chunk (state=3): >>><<< 30583 1726853788.80286: stdout chunk (state=3): >>><<< 30583 1726853788.80300: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853788.7720642-35968-28852616904088=/root/.ansible/tmp/ansible-tmp-1726853788.7720642-35968-28852616904088 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853788.80339: variable 'ansible_module_compression' from source: unknown 30583 1726853788.80375: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30583 1726853788.80404: variable 'ansible_facts' from source: unknown 30583 1726853788.80456: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853788.7720642-35968-28852616904088/AnsiballZ_ping.py 30583 1726853788.80554: Sending initial data 30583 1726853788.80560: Sent initial data (152 bytes) 30583 1726853788.80992: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853788.80995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853788.80998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853788.81000: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853788.81002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853788.81052: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853788.81058: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853788.81060: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853788.81125: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853788.82796: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30583 1726853788.82800: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853788.82862: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853788.82934: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpcnv_pgl0 /root/.ansible/tmp/ansible-tmp-1726853788.7720642-35968-28852616904088/AnsiballZ_ping.py <<< 30583 1726853788.82937: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853788.7720642-35968-28852616904088/AnsiballZ_ping.py" <<< 30583 1726853788.83000: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpcnv_pgl0" to remote "/root/.ansible/tmp/ansible-tmp-1726853788.7720642-35968-28852616904088/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853788.7720642-35968-28852616904088/AnsiballZ_ping.py" <<< 30583 1726853788.83633: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853788.83670: stderr chunk (state=3): >>><<< 30583 1726853788.83680: stdout chunk (state=3): >>><<< 30583 1726853788.83716: done transferring module to remote 30583 1726853788.83725: _low_level_execute_command(): starting 30583 1726853788.83730: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853788.7720642-35968-28852616904088/ /root/.ansible/tmp/ansible-tmp-1726853788.7720642-35968-28852616904088/AnsiballZ_ping.py && sleep 0' 30583 1726853788.84142: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853788.84145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853788.84147: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853788.84149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853788.84198: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853788.84202: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853788.84278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853788.86156: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853788.86182: stderr chunk (state=3): >>><<< 30583 1726853788.86186: stdout chunk (state=3): >>><<< 30583 1726853788.86200: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 
10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853788.86203: _low_level_execute_command(): starting 30583 1726853788.86206: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853788.7720642-35968-28852616904088/AnsiballZ_ping.py && sleep 0' 30583 1726853788.86644: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853788.86647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853788.86650: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853788.86652: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853788.86654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853788.86656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853788.86704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 
30583 1726853788.86713: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853788.86786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853789.02374: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30583 1726853789.03774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853789.03804: stderr chunk (state=3): >>><<< 30583 1726853789.03807: stdout chunk (state=3): >>><<< 30583 1726853789.03823: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
30583 1726853789.03846: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853788.7720642-35968-28852616904088/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853789.03856: _low_level_execute_command(): starting 30583 1726853789.03861: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853788.7720642-35968-28852616904088/ > /dev/null 2>&1 && sleep 0' 30583 1726853789.04320: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853789.04324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853789.04326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853789.04328: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853789.04330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853789.04384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853789.04387: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853789.04389: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853789.04465: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853789.06350: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853789.06378: stderr chunk (state=3): >>><<< 30583 1726853789.06381: stdout chunk (state=3): >>><<< 30583 1726853789.06397: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 30583 1726853789.06405: handler run complete 30583 1726853789.06415: attempt loop complete, returning result 30583 1726853789.06418: _execute() done 30583 1726853789.06420: dumping result to json 30583 1726853789.06423: done dumping result, returning 30583 1726853789.06431: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-05ea-abc5-0000000024b9] 30583 1726853789.06436: sending task result for task 02083763-bbaf-05ea-abc5-0000000024b9 30583 1726853789.06527: done sending task result for task 02083763-bbaf-05ea-abc5-0000000024b9 30583 1726853789.06530: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 30583 1726853789.06606: no more pending results, returning what we have 30583 1726853789.06609: results queue empty 30583 1726853789.06610: checking for any_errors_fatal 30583 1726853789.06618: done checking for any_errors_fatal 30583 1726853789.06619: checking for max_fail_percentage 30583 1726853789.06621: done checking for max_fail_percentage 30583 1726853789.06622: checking to see if all hosts have failed and the running result is not ok 30583 1726853789.06622: done checking to see if all hosts have failed 30583 1726853789.06623: getting the remaining hosts for this loop 30583 1726853789.06625: done getting the remaining hosts for this loop 30583 1726853789.06628: getting the next task for host managed_node2 30583 1726853789.06643: done getting next task for host managed_node2 30583 1726853789.06646: ^ task is: TASK: meta (role_complete) 30583 1726853789.06650: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853789.06665: getting variables 30583 1726853789.06667: in VariableManager get_vars() 30583 1726853789.06718: Calling all_inventory to load vars for managed_node2 30583 1726853789.06721: Calling groups_inventory to load vars for managed_node2 30583 1726853789.06723: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853789.06732: Calling all_plugins_play to load vars for managed_node2 30583 1726853789.06735: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853789.06737: Calling groups_plugins_play to load vars for managed_node2 30583 1726853789.07665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853789.08520: done with get_vars() 30583 1726853789.08538: done getting variables 30583 1726853789.08598: done queuing things up, now waiting for results queue to drain 30583 1726853789.08600: results queue empty 30583 1726853789.08600: checking for any_errors_fatal 30583 1726853789.08602: done checking for 
any_errors_fatal 30583 1726853789.08603: checking for max_fail_percentage 30583 1726853789.08604: done checking for max_fail_percentage 30583 1726853789.08604: checking to see if all hosts have failed and the running result is not ok 30583 1726853789.08604: done checking to see if all hosts have failed 30583 1726853789.08605: getting the remaining hosts for this loop 30583 1726853789.08606: done getting the remaining hosts for this loop 30583 1726853789.08607: getting the next task for host managed_node2 30583 1726853789.08611: done getting next task for host managed_node2 30583 1726853789.08613: ^ task is: TASK: Test 30583 1726853789.08614: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853789.08616: getting variables 30583 1726853789.08617: in VariableManager get_vars() 30583 1726853789.08626: Calling all_inventory to load vars for managed_node2 30583 1726853789.08628: Calling groups_inventory to load vars for managed_node2 30583 1726853789.08629: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853789.08633: Calling all_plugins_play to load vars for managed_node2 30583 1726853789.08635: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853789.08637: Calling groups_plugins_play to load vars for managed_node2 30583 1726853789.09260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853789.10181: done with get_vars() 30583 1726853789.10195: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 13:36:29 -0400 (0:00:00.373) 0:02:04.439 ****** 30583 1726853789.10248: entering _queue_task() for managed_node2/include_tasks 30583 1726853789.10518: worker is 1 (out of 1 available) 30583 1726853789.10532: exiting _queue_task() for managed_node2/include_tasks 30583 1726853789.10547: done queuing things up, now waiting for results queue to drain 30583 1726853789.10548: waiting for pending results... 
30583 1726853789.10743: running TaskExecutor() for managed_node2/TASK: Test 30583 1726853789.10833: in run() - task 02083763-bbaf-05ea-abc5-0000000020b1 30583 1726853789.10844: variable 'ansible_search_path' from source: unknown 30583 1726853789.10848: variable 'ansible_search_path' from source: unknown 30583 1726853789.10895: variable 'lsr_test' from source: include params 30583 1726853789.11057: variable 'lsr_test' from source: include params 30583 1726853789.11116: variable 'omit' from source: magic vars 30583 1726853789.11221: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853789.11229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853789.11238: variable 'omit' from source: magic vars 30583 1726853789.11406: variable 'ansible_distribution_major_version' from source: facts 30583 1726853789.11415: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853789.11418: variable 'item' from source: unknown 30583 1726853789.11468: variable 'item' from source: unknown 30583 1726853789.11490: variable 'item' from source: unknown 30583 1726853789.11533: variable 'item' from source: unknown 30583 1726853789.11656: dumping result to json 30583 1726853789.11660: done dumping result, returning 30583 1726853789.11662: done running TaskExecutor() for managed_node2/TASK: Test [02083763-bbaf-05ea-abc5-0000000020b1] 30583 1726853789.11664: sending task result for task 02083763-bbaf-05ea-abc5-0000000020b1 30583 1726853789.11702: done sending task result for task 02083763-bbaf-05ea-abc5-0000000020b1 30583 1726853789.11705: WORKER PROCESS EXITING 30583 1726853789.11724: no more pending results, returning what we have 30583 1726853789.11728: in VariableManager get_vars() 30583 1726853789.11781: Calling all_inventory to load vars for managed_node2 30583 1726853789.11784: Calling groups_inventory to load vars for managed_node2 30583 1726853789.11787: Calling all_plugins_inventory to load 
vars for managed_node2 30583 1726853789.11798: Calling all_plugins_play to load vars for managed_node2 30583 1726853789.11801: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853789.11803: Calling groups_plugins_play to load vars for managed_node2 30583 1726853789.12565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853789.13406: done with get_vars() 30583 1726853789.13420: variable 'ansible_search_path' from source: unknown 30583 1726853789.13421: variable 'ansible_search_path' from source: unknown 30583 1726853789.13449: we have included files to process 30583 1726853789.13450: generating all_blocks data 30583 1726853789.13451: done generating all_blocks data 30583 1726853789.13455: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30583 1726853789.13456: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30583 1726853789.13457: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30583 1726853789.13540: done processing included file 30583 1726853789.13541: iterating over new_blocks loaded from include file 30583 1726853789.13542: in VariableManager get_vars() 30583 1726853789.13553: done with get_vars() 30583 1726853789.13554: filtering new block on tags 30583 1726853789.13573: done filtering new block on tags 30583 1726853789.13575: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml for managed_node2 => (item=tasks/remove+down_profile.yml) 30583 1726853789.13579: extending task lists for all hosts with included blocks 30583 1726853789.14103: done extending task 
lists 30583 1726853789.14104: done processing included files 30583 1726853789.14105: results queue empty 30583 1726853789.14105: checking for any_errors_fatal 30583 1726853789.14107: done checking for any_errors_fatal 30583 1726853789.14107: checking for max_fail_percentage 30583 1726853789.14108: done checking for max_fail_percentage 30583 1726853789.14109: checking to see if all hosts have failed and the running result is not ok 30583 1726853789.14109: done checking to see if all hosts have failed 30583 1726853789.14110: getting the remaining hosts for this loop 30583 1726853789.14111: done getting the remaining hosts for this loop 30583 1726853789.14112: getting the next task for host managed_node2 30583 1726853789.14115: done getting next task for host managed_node2 30583 1726853789.14116: ^ task is: TASK: Include network role 30583 1726853789.14118: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853789.14120: getting variables 30583 1726853789.14121: in VariableManager get_vars() 30583 1726853789.14129: Calling all_inventory to load vars for managed_node2 30583 1726853789.14131: Calling groups_inventory to load vars for managed_node2 30583 1726853789.14132: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853789.14136: Calling all_plugins_play to load vars for managed_node2 30583 1726853789.14137: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853789.14139: Calling groups_plugins_play to load vars for managed_node2 30583 1726853789.19185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853789.20009: done with get_vars() 30583 1726853789.20029: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml:3 Friday 20 September 2024 13:36:29 -0400 (0:00:00.098) 0:02:04.538 ****** 30583 1726853789.20086: entering _queue_task() for managed_node2/include_role 30583 1726853789.20365: worker is 1 (out of 1 available) 30583 1726853789.20381: exiting _queue_task() for managed_node2/include_role 30583 1726853789.20395: done queuing things up, now waiting for results queue to drain 30583 1726853789.20397: waiting for pending results... 
30583 1726853789.20590: running TaskExecutor() for managed_node2/TASK: Include network role 30583 1726853789.20705: in run() - task 02083763-bbaf-05ea-abc5-000000002612 30583 1726853789.20716: variable 'ansible_search_path' from source: unknown 30583 1726853789.20721: variable 'ansible_search_path' from source: unknown 30583 1726853789.20752: calling self._execute() 30583 1726853789.20836: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853789.20842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853789.20852: variable 'omit' from source: magic vars 30583 1726853789.21140: variable 'ansible_distribution_major_version' from source: facts 30583 1726853789.21151: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853789.21156: _execute() done 30583 1726853789.21159: dumping result to json 30583 1726853789.21164: done dumping result, returning 30583 1726853789.21173: done running TaskExecutor() for managed_node2/TASK: Include network role [02083763-bbaf-05ea-abc5-000000002612] 30583 1726853789.21181: sending task result for task 02083763-bbaf-05ea-abc5-000000002612 30583 1726853789.21283: done sending task result for task 02083763-bbaf-05ea-abc5-000000002612 30583 1726853789.21287: WORKER PROCESS EXITING 30583 1726853789.21313: no more pending results, returning what we have 30583 1726853789.21318: in VariableManager get_vars() 30583 1726853789.21367: Calling all_inventory to load vars for managed_node2 30583 1726853789.21370: Calling groups_inventory to load vars for managed_node2 30583 1726853789.21379: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853789.21391: Calling all_plugins_play to load vars for managed_node2 30583 1726853789.21394: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853789.21396: Calling groups_plugins_play to load vars for managed_node2 30583 1726853789.22180: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853789.23044: done with get_vars() 30583 1726853789.23058: variable 'ansible_search_path' from source: unknown 30583 1726853789.23059: variable 'ansible_search_path' from source: unknown 30583 1726853789.23149: variable 'omit' from source: magic vars 30583 1726853789.23178: variable 'omit' from source: magic vars 30583 1726853789.23188: variable 'omit' from source: magic vars 30583 1726853789.23190: we have included files to process 30583 1726853789.23191: generating all_blocks data 30583 1726853789.23192: done generating all_blocks data 30583 1726853789.23193: processing included file: fedora.linux_system_roles.network 30583 1726853789.23206: in VariableManager get_vars() 30583 1726853789.23217: done with get_vars() 30583 1726853789.23235: in VariableManager get_vars() 30583 1726853789.23247: done with get_vars() 30583 1726853789.23274: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30583 1726853789.23345: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30583 1726853789.23393: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30583 1726853789.23658: in VariableManager get_vars() 30583 1726853789.23675: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30583 1726853789.24875: iterating over new_blocks loaded from include file 30583 1726853789.24877: in VariableManager get_vars() 30583 1726853789.24888: done with get_vars() 30583 1726853789.24889: filtering new block on tags 30583 1726853789.25087: done filtering new block on tags 30583 1726853789.25090: in VariableManager get_vars() 30583 1726853789.25101: done with get_vars() 30583 1726853789.25102: filtering new block on tags 30583 1726853789.25112: done 
filtering new block on tags 30583 1726853789.25114: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30583 1726853789.25117: extending task lists for all hosts with included blocks 30583 1726853789.25183: done extending task lists 30583 1726853789.25184: done processing included files 30583 1726853789.25184: results queue empty 30583 1726853789.25185: checking for any_errors_fatal 30583 1726853789.25187: done checking for any_errors_fatal 30583 1726853789.25188: checking for max_fail_percentage 30583 1726853789.25188: done checking for max_fail_percentage 30583 1726853789.25189: checking to see if all hosts have failed and the running result is not ok 30583 1726853789.25189: done checking to see if all hosts have failed 30583 1726853789.25190: getting the remaining hosts for this loop 30583 1726853789.25191: done getting the remaining hosts for this loop 30583 1726853789.25192: getting the next task for host managed_node2 30583 1726853789.25196: done getting next task for host managed_node2 30583 1726853789.25197: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30583 1726853789.25199: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853789.25207: getting variables 30583 1726853789.25208: in VariableManager get_vars() 30583 1726853789.25217: Calling all_inventory to load vars for managed_node2 30583 1726853789.25218: Calling groups_inventory to load vars for managed_node2 30583 1726853789.25219: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853789.25223: Calling all_plugins_play to load vars for managed_node2 30583 1726853789.25224: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853789.25226: Calling groups_plugins_play to load vars for managed_node2 30583 1726853789.25848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853789.26685: done with get_vars() 30583 1726853789.26699: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:36:29 -0400 (0:00:00.066) 0:02:04.604 ****** 30583 1726853789.26746: entering _queue_task() for managed_node2/include_tasks 30583 1726853789.27018: worker is 1 (out of 1 available) 30583 1726853789.27034: exiting _queue_task() for managed_node2/include_tasks 30583 1726853789.27047: done queuing things up, now waiting for results queue to drain 30583 1726853789.27048: waiting for pending results... 
30583 1726853789.27247: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30583 1726853789.27338: in run() - task 02083763-bbaf-05ea-abc5-000000002694 30583 1726853789.27350: variable 'ansible_search_path' from source: unknown 30583 1726853789.27353: variable 'ansible_search_path' from source: unknown 30583 1726853789.27387: calling self._execute() 30583 1726853789.27466: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853789.27470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853789.27481: variable 'omit' from source: magic vars 30583 1726853789.27778: variable 'ansible_distribution_major_version' from source: facts 30583 1726853789.27788: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853789.27794: _execute() done 30583 1726853789.27796: dumping result to json 30583 1726853789.27799: done dumping result, returning 30583 1726853789.27806: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-05ea-abc5-000000002694] 30583 1726853789.27810: sending task result for task 02083763-bbaf-05ea-abc5-000000002694 30583 1726853789.27897: done sending task result for task 02083763-bbaf-05ea-abc5-000000002694 30583 1726853789.27900: WORKER PROCESS EXITING 30583 1726853789.27974: no more pending results, returning what we have 30583 1726853789.27980: in VariableManager get_vars() 30583 1726853789.28031: Calling all_inventory to load vars for managed_node2 30583 1726853789.28034: Calling groups_inventory to load vars for managed_node2 30583 1726853789.28037: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853789.28047: Calling all_plugins_play to load vars for managed_node2 30583 1726853789.28050: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853789.28052: Calling 
groups_plugins_play to load vars for managed_node2 30583 1726853789.28917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853789.29780: done with get_vars() 30583 1726853789.29794: variable 'ansible_search_path' from source: unknown 30583 1726853789.29795: variable 'ansible_search_path' from source: unknown 30583 1726853789.29820: we have included files to process 30583 1726853789.29821: generating all_blocks data 30583 1726853789.29822: done generating all_blocks data 30583 1726853789.29824: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853789.29825: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853789.29826: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30583 1726853789.30191: done processing included file 30583 1726853789.30193: iterating over new_blocks loaded from include file 30583 1726853789.30194: in VariableManager get_vars() 30583 1726853789.30209: done with get_vars() 30583 1726853789.30210: filtering new block on tags 30583 1726853789.30230: done filtering new block on tags 30583 1726853789.30232: in VariableManager get_vars() 30583 1726853789.30247: done with get_vars() 30583 1726853789.30248: filtering new block on tags 30583 1726853789.30276: done filtering new block on tags 30583 1726853789.30278: in VariableManager get_vars() 30583 1726853789.30293: done with get_vars() 30583 1726853789.30294: filtering new block on tags 30583 1726853789.30316: done filtering new block on tags 30583 1726853789.30318: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30583 1726853789.30321: extending task lists for 
all hosts with included blocks 30583 1726853789.31260: done extending task lists 30583 1726853789.31261: done processing included files 30583 1726853789.31262: results queue empty 30583 1726853789.31262: checking for any_errors_fatal 30583 1726853789.31264: done checking for any_errors_fatal 30583 1726853789.31264: checking for max_fail_percentage 30583 1726853789.31265: done checking for max_fail_percentage 30583 1726853789.31266: checking to see if all hosts have failed and the running result is not ok 30583 1726853789.31266: done checking to see if all hosts have failed 30583 1726853789.31267: getting the remaining hosts for this loop 30583 1726853789.31268: done getting the remaining hosts for this loop 30583 1726853789.31270: getting the next task for host managed_node2 30583 1726853789.31274: done getting next task for host managed_node2 30583 1726853789.31276: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30583 1726853789.31279: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853789.31286: getting variables 30583 1726853789.31287: in VariableManager get_vars() 30583 1726853789.31296: Calling all_inventory to load vars for managed_node2 30583 1726853789.31298: Calling groups_inventory to load vars for managed_node2 30583 1726853789.31299: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853789.31302: Calling all_plugins_play to load vars for managed_node2 30583 1726853789.31304: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853789.31306: Calling groups_plugins_play to load vars for managed_node2 30583 1726853789.31909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853789.32744: done with get_vars() 30583 1726853789.32758: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:36:29 -0400 (0:00:00.060) 0:02:04.665 ****** 30583 1726853789.32809: entering _queue_task() for managed_node2/setup 30583 1726853789.33075: worker is 1 (out of 1 available) 30583 1726853789.33088: exiting _queue_task() for managed_node2/setup 30583 1726853789.33102: done queuing things up, now waiting for results queue to drain 30583 1726853789.33104: waiting for pending results... 
30583 1726853789.33295: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30583 1726853789.33398: in run() - task 02083763-bbaf-05ea-abc5-0000000026eb 30583 1726853789.33410: variable 'ansible_search_path' from source: unknown 30583 1726853789.33414: variable 'ansible_search_path' from source: unknown 30583 1726853789.33443: calling self._execute() 30583 1726853789.33517: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853789.33520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853789.33529: variable 'omit' from source: magic vars 30583 1726853789.33817: variable 'ansible_distribution_major_version' from source: facts 30583 1726853789.33826: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853789.33976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853789.35566: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853789.35614: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853789.35641: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853789.35668: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853789.35689: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853789.35746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853789.35769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853789.35789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853789.35814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853789.35825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853789.35866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853789.35885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853789.35900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853789.35925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853789.35935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853789.36045: variable '__network_required_facts' from source: role 
'' defaults 30583 1726853789.36058: variable 'ansible_facts' from source: unknown 30583 1726853789.36488: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30583 1726853789.36492: when evaluation is False, skipping this task 30583 1726853789.36495: _execute() done 30583 1726853789.36497: dumping result to json 30583 1726853789.36500: done dumping result, returning 30583 1726853789.36508: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-05ea-abc5-0000000026eb] 30583 1726853789.36510: sending task result for task 02083763-bbaf-05ea-abc5-0000000026eb 30583 1726853789.36592: done sending task result for task 02083763-bbaf-05ea-abc5-0000000026eb 30583 1726853789.36596: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853789.36650: no more pending results, returning what we have 30583 1726853789.36654: results queue empty 30583 1726853789.36655: checking for any_errors_fatal 30583 1726853789.36656: done checking for any_errors_fatal 30583 1726853789.36657: checking for max_fail_percentage 30583 1726853789.36659: done checking for max_fail_percentage 30583 1726853789.36662: checking to see if all hosts have failed and the running result is not ok 30583 1726853789.36663: done checking to see if all hosts have failed 30583 1726853789.36664: getting the remaining hosts for this loop 30583 1726853789.36666: done getting the remaining hosts for this loop 30583 1726853789.36670: getting the next task for host managed_node2 30583 1726853789.36682: done getting next task for host managed_node2 30583 1726853789.36685: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30583 1726853789.36691: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853789.36720: getting variables 30583 1726853789.36722: in VariableManager get_vars() 30583 1726853789.36767: Calling all_inventory to load vars for managed_node2 30583 1726853789.36770: Calling groups_inventory to load vars for managed_node2 30583 1726853789.36777: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853789.36786: Calling all_plugins_play to load vars for managed_node2 30583 1726853789.36788: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853789.36797: Calling groups_plugins_play to load vars for managed_node2 30583 1726853789.37666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853789.38540: done with get_vars() 30583 1726853789.38555: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:36:29 -0400 (0:00:00.058) 0:02:04.723 ****** 30583 1726853789.38626: entering _queue_task() for managed_node2/stat 30583 1726853789.38876: worker is 1 (out of 1 available) 30583 1726853789.38891: exiting _queue_task() for managed_node2/stat 30583 1726853789.38903: done queuing things up, now waiting for results queue to drain 30583 1726853789.38905: waiting for pending results... 
30583 1726853789.39104: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30583 1726853789.39203: in run() - task 02083763-bbaf-05ea-abc5-0000000026ed 30583 1726853789.39214: variable 'ansible_search_path' from source: unknown 30583 1726853789.39218: variable 'ansible_search_path' from source: unknown 30583 1726853789.39248: calling self._execute() 30583 1726853789.39330: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853789.39334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853789.39344: variable 'omit' from source: magic vars 30583 1726853789.39640: variable 'ansible_distribution_major_version' from source: facts 30583 1726853789.39649: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853789.39766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853789.39977: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853789.40001: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853789.40027: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853789.40053: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853789.40122: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853789.40140: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853789.40160: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853789.40178: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853789.40250: variable '__network_is_ostree' from source: set_fact 30583 1726853789.40254: Evaluated conditional (not __network_is_ostree is defined): False 30583 1726853789.40260: when evaluation is False, skipping this task 30583 1726853789.40263: _execute() done 30583 1726853789.40265: dumping result to json 30583 1726853789.40267: done dumping result, returning 30583 1726853789.40274: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-05ea-abc5-0000000026ed] 30583 1726853789.40280: sending task result for task 02083763-bbaf-05ea-abc5-0000000026ed 30583 1726853789.40365: done sending task result for task 02083763-bbaf-05ea-abc5-0000000026ed 30583 1726853789.40368: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30583 1726853789.40420: no more pending results, returning what we have 30583 1726853789.40424: results queue empty 30583 1726853789.40425: checking for any_errors_fatal 30583 1726853789.40435: done checking for any_errors_fatal 30583 1726853789.40436: checking for max_fail_percentage 30583 1726853789.40437: done checking for max_fail_percentage 30583 1726853789.40438: checking to see if all hosts have failed and the running result is not ok 30583 1726853789.40439: done checking to see if all hosts have failed 30583 1726853789.40440: getting the remaining hosts for this loop 30583 1726853789.40442: done getting the remaining hosts for this loop 30583 
1726853789.40446: getting the next task for host managed_node2 30583 1726853789.40453: done getting next task for host managed_node2 30583 1726853789.40457: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30583 1726853789.40464: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853789.40492: getting variables 30583 1726853789.40493: in VariableManager get_vars() 30583 1726853789.40534: Calling all_inventory to load vars for managed_node2 30583 1726853789.40537: Calling groups_inventory to load vars for managed_node2 30583 1726853789.40539: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853789.40547: Calling all_plugins_play to load vars for managed_node2 30583 1726853789.40550: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853789.40552: Calling groups_plugins_play to load vars for managed_node2 30583 1726853789.41327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853789.42321: done with get_vars() 30583 1726853789.42336: done getting variables 30583 1726853789.42384: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:36:29 -0400 (0:00:00.037) 0:02:04.761 ****** 30583 1726853789.42412: entering _queue_task() for managed_node2/set_fact 30583 1726853789.42668: worker is 1 (out of 1 available) 30583 1726853789.42683: exiting _queue_task() for managed_node2/set_fact 30583 1726853789.42697: done queuing things up, now waiting for results queue to drain 30583 1726853789.42699: waiting for pending results... 
30583 1726853789.42885: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30583 1726853789.42982: in run() - task 02083763-bbaf-05ea-abc5-0000000026ee 30583 1726853789.42994: variable 'ansible_search_path' from source: unknown 30583 1726853789.42998: variable 'ansible_search_path' from source: unknown 30583 1726853789.43026: calling self._execute() 30583 1726853789.43107: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853789.43111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853789.43121: variable 'omit' from source: magic vars 30583 1726853789.43405: variable 'ansible_distribution_major_version' from source: facts 30583 1726853789.43414: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853789.43532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853789.43730: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853789.43765: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853789.43793: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853789.43822: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853789.43886: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853789.43904: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853789.43923: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853789.43940: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853789.44007: variable '__network_is_ostree' from source: set_fact 30583 1726853789.44015: Evaluated conditional (not __network_is_ostree is defined): False 30583 1726853789.44018: when evaluation is False, skipping this task 30583 1726853789.44020: _execute() done 30583 1726853789.44023: dumping result to json 30583 1726853789.44025: done dumping result, returning 30583 1726853789.44034: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-05ea-abc5-0000000026ee] 30583 1726853789.44036: sending task result for task 02083763-bbaf-05ea-abc5-0000000026ee 30583 1726853789.44120: done sending task result for task 02083763-bbaf-05ea-abc5-0000000026ee 30583 1726853789.44123: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30583 1726853789.44182: no more pending results, returning what we have 30583 1726853789.44186: results queue empty 30583 1726853789.44187: checking for any_errors_fatal 30583 1726853789.44193: done checking for any_errors_fatal 30583 1726853789.44194: checking for max_fail_percentage 30583 1726853789.44196: done checking for max_fail_percentage 30583 1726853789.44197: checking to see if all hosts have failed and the running result is not ok 30583 1726853789.44198: done checking to see if all hosts have failed 30583 1726853789.44198: getting the remaining hosts for this loop 30583 1726853789.44200: done getting the remaining hosts for this loop 
30583 1726853789.44204: getting the next task for host managed_node2 30583 1726853789.44215: done getting next task for host managed_node2 30583 1726853789.44219: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853789.44224: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853789.44248: getting variables 30583 1726853789.44250: in VariableManager get_vars() 30583 1726853789.44293: Calling all_inventory to load vars for managed_node2 30583 1726853789.44296: Calling groups_inventory to load vars for managed_node2 30583 1726853789.44298: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853789.44306: Calling all_plugins_play to load vars for managed_node2 30583 1726853789.44308: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853789.44311: Calling groups_plugins_play to load vars for managed_node2 30583 1726853789.45078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853789.45945: done with get_vars() 30583 1726853789.45962: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:36:29 -0400 (0:00:00.036) 0:02:04.797 ****** 30583 1726853789.46034: entering _queue_task() for managed_node2/service_facts 30583 1726853789.46278: worker is 1 (out of 1 available) 30583 1726853789.46292: exiting _queue_task() for managed_node2/service_facts 30583 1726853789.46307: done queuing things up, now waiting for results queue to drain 30583 1726853789.46308: waiting for pending results... 
30583 1726853789.46507: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30583 1726853789.46597: in run() - task 02083763-bbaf-05ea-abc5-0000000026f0 30583 1726853789.46608: variable 'ansible_search_path' from source: unknown 30583 1726853789.46612: variable 'ansible_search_path' from source: unknown 30583 1726853789.46645: calling self._execute() 30583 1726853789.46723: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853789.46727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853789.46735: variable 'omit' from source: magic vars 30583 1726853789.47029: variable 'ansible_distribution_major_version' from source: facts 30583 1726853789.47039: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853789.47045: variable 'omit' from source: magic vars 30583 1726853789.47105: variable 'omit' from source: magic vars 30583 1726853789.47128: variable 'omit' from source: magic vars 30583 1726853789.47160: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853789.47193: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853789.47209: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853789.47222: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853789.47232: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853789.47257: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853789.47263: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853789.47266: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853789.47337: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853789.47341: Set connection var ansible_timeout to 10 30583 1726853789.47344: Set connection var ansible_connection to ssh 30583 1726853789.47349: Set connection var ansible_shell_executable to /bin/sh 30583 1726853789.47352: Set connection var ansible_shell_type to sh 30583 1726853789.47360: Set connection var ansible_pipelining to False 30583 1726853789.47382: variable 'ansible_shell_executable' from source: unknown 30583 1726853789.47385: variable 'ansible_connection' from source: unknown 30583 1726853789.47388: variable 'ansible_module_compression' from source: unknown 30583 1726853789.47390: variable 'ansible_shell_type' from source: unknown 30583 1726853789.47393: variable 'ansible_shell_executable' from source: unknown 30583 1726853789.47395: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853789.47399: variable 'ansible_pipelining' from source: unknown 30583 1726853789.47401: variable 'ansible_timeout' from source: unknown 30583 1726853789.47403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853789.47547: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853789.47556: variable 'omit' from source: magic vars 30583 1726853789.47563: starting attempt loop 30583 1726853789.47566: running the handler 30583 1726853789.47580: _low_level_execute_command(): starting 30583 1726853789.47587: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853789.48072: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30583 1726853789.48096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853789.48101: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853789.48155: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853789.48161: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853789.48166: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853789.48244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853789.49965: stdout chunk (state=3): >>>/root <<< 30583 1726853789.50066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853789.50097: stderr chunk (state=3): >>><<< 30583 1726853789.50101: stdout chunk (state=3): >>><<< 30583 1726853789.50121: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853789.50132: _low_level_execute_command(): starting 30583 1726853789.50138: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853789.5012102-35985-62692684058784 `" && echo ansible-tmp-1726853789.5012102-35985-62692684058784="` echo /root/.ansible/tmp/ansible-tmp-1726853789.5012102-35985-62692684058784 `" ) && sleep 0' 30583 1726853789.50560: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853789.50563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853789.50566: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853789.50577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853789.50626: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853789.50632: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853789.50635: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853789.50703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853789.52706: stdout chunk (state=3): >>>ansible-tmp-1726853789.5012102-35985-62692684058784=/root/.ansible/tmp/ansible-tmp-1726853789.5012102-35985-62692684058784 <<< 30583 1726853789.52814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853789.52841: stderr chunk (state=3): >>><<< 30583 1726853789.52844: stdout chunk (state=3): >>><<< 30583 1726853789.52861: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853789.5012102-35985-62692684058784=/root/.ansible/tmp/ansible-tmp-1726853789.5012102-35985-62692684058784 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853789.52897: variable 'ansible_module_compression' from source: unknown 30583 1726853789.52930: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30583 1726853789.52976: variable 'ansible_facts' from source: unknown 30583 1726853789.53020: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853789.5012102-35985-62692684058784/AnsiballZ_service_facts.py 30583 1726853789.53119: Sending initial data 30583 1726853789.53122: Sent initial data (161 bytes) 30583 1726853789.53553: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853789.53556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853789.53561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853789.53563: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853789.53565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853789.53618: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853789.53624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853789.53695: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853789.55388: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30583 1726853789.55393: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853789.55455: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853789.55524: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpz5tge8zm /root/.ansible/tmp/ansible-tmp-1726853789.5012102-35985-62692684058784/AnsiballZ_service_facts.py <<< 30583 1726853789.55527: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853789.5012102-35985-62692684058784/AnsiballZ_service_facts.py" <<< 30583 1726853789.55592: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpz5tge8zm" to remote "/root/.ansible/tmp/ansible-tmp-1726853789.5012102-35985-62692684058784/AnsiballZ_service_facts.py" <<< 30583 1726853789.55597: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853789.5012102-35985-62692684058784/AnsiballZ_service_facts.py" <<< 30583 1726853789.56251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853789.56294: stderr chunk (state=3): >>><<< 30583 1726853789.56297: stdout chunk (state=3): >>><<< 30583 1726853789.56344: done transferring module to remote 30583 1726853789.56352: _low_level_execute_command(): starting 30583 1726853789.56357: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853789.5012102-35985-62692684058784/ /root/.ansible/tmp/ansible-tmp-1726853789.5012102-35985-62692684058784/AnsiballZ_service_facts.py && sleep 0' 30583 1726853789.56799: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853789.56802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853789.56804: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853789.56807: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853789.56808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853789.56814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853789.56858: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853789.56862: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853789.56946: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853789.58804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853789.58827: stderr chunk (state=3): >>><<< 30583 1726853789.58830: stdout chunk (state=3): >>><<< 30583 1726853789.58841: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853789.58844: _low_level_execute_command(): starting 30583 1726853789.58849: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853789.5012102-35985-62692684058784/AnsiballZ_service_facts.py && sleep 0' 30583 1726853789.59276: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853789.59279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853789.59282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853789.59284: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853789.59287: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853789.59325: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853789.59337: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853789.59414: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853791.23184: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": 
"sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": 
{"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static<<< 30583 1726853791.23204: stdout chunk (state=3): >>>", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": 
"systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.servic<<< 30583 1726853791.23218: stdout chunk (state=3): >>>e": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", 
"state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30583 1726853791.24807: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853791.24835: stderr chunk (state=3): >>><<< 30583 1726853791.24838: stdout chunk (state=3): >>><<< 30583 1726853791.24872: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": 
"getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": 
"systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": 
"running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": 
"inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": 
"systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853791.25592: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853789.5012102-35985-62692684058784/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853791.25601: _low_level_execute_command(): starting 30583 1726853791.25606: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853789.5012102-35985-62692684058784/ > /dev/null 2>&1 && sleep 0' 30583 1726853791.26061: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853791.26065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853791.26067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853791.26069: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30583 1726853791.26074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853791.26123: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853791.26127: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853791.26132: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853791.26203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853791.28126: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853791.28153: stderr chunk (state=3): >>><<< 30583 1726853791.28156: stdout chunk (state=3): >>><<< 30583 1726853791.28168: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853791.28176: handler run complete 30583 1726853791.28293: variable 'ansible_facts' from source: unknown 30583 1726853791.28390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853791.28676: variable 'ansible_facts' from source: unknown 30583 1726853791.28753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853791.28869: attempt loop complete, returning result 30583 1726853791.28874: _execute() done 30583 1726853791.28876: dumping result to json 30583 1726853791.28914: done dumping result, returning 30583 1726853791.28923: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-05ea-abc5-0000000026f0] 30583 1726853791.28927: sending task result for task 02083763-bbaf-05ea-abc5-0000000026f0 30583 1726853791.29706: done sending task result for task 02083763-bbaf-05ea-abc5-0000000026f0 30583 1726853791.29709: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853791.29763: no more pending results, returning what we have 30583 1726853791.29765: results queue empty 30583 1726853791.29766: checking for any_errors_fatal 30583 1726853791.29768: done checking for any_errors_fatal 30583 1726853791.29769: checking for max_fail_percentage 30583 1726853791.29770: done checking for max_fail_percentage 30583 1726853791.29773: checking to see if all hosts have failed and the running result is not ok 30583 1726853791.29773: done checking to see if all hosts have failed 30583 1726853791.29774: getting the remaining 
hosts for this loop 30583 1726853791.29775: done getting the remaining hosts for this loop 30583 1726853791.29777: getting the next task for host managed_node2 30583 1726853791.29781: done getting next task for host managed_node2 30583 1726853791.29783: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 1726853791.29788: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853791.29796: getting variables 30583 1726853791.29797: in VariableManager get_vars() 30583 1726853791.29823: Calling all_inventory to load vars for managed_node2 30583 1726853791.29825: Calling groups_inventory to load vars for managed_node2 30583 1726853791.29826: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853791.29832: Calling all_plugins_play to load vars for managed_node2 30583 1726853791.29835: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853791.29841: Calling groups_plugins_play to load vars for managed_node2 30583 1726853791.30512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853791.31378: done with get_vars() 30583 1726853791.31395: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:36:31 -0400 (0:00:01.854) 0:02:06.651 ****** 30583 1726853791.31469: entering _queue_task() for managed_node2/package_facts 30583 1726853791.31726: worker is 1 (out of 1 available) 30583 1726853791.31740: exiting _queue_task() for managed_node2/package_facts 30583 1726853791.31754: done queuing things up, now waiting for results queue to drain 30583 1726853791.31755: waiting for pending results... 
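The task banner above marks the start of the `package_facts` module run (task path `roles/network/tasks/set_facts.yml:26` in the collection). As the stdout chunks further down show, the module returns facts shaped like `{"ansible_facts": {"packages": {<name>: [{name, version, release, epoch, arch, source}, ...]}}}`, where each package name maps to a list because multiple versions or architectures can be installed side by side. A minimal Python sketch of consuming that structure (the sample fragment uses the `bash` entry visible later in this log; the surrounding JSON is reconstructed for illustration, not copied verbatim from the run):

```python
import json

# Illustrative fragment mirroring the shape of package_facts output.
# The bash entry matches the values reported later in this log.
result = json.loads("""
{"ansible_facts": {"packages": {
  "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10",
            "epoch": null, "arch": "x86_64", "source": "rpm"}]
}}}
""")

packages = result["ansible_facts"]["packages"]

# Each key maps to a LIST of installs, so iterate the inner list too.
for name, installs in packages.items():
    for pkg in installs:
        print(f"{name} {pkg['version']}-{pkg['release']}.{pkg['arch']}")
```

In a playbook, the same data is reachable as `ansible_facts.packages` after the task runs, e.g. `ansible_facts.packages['bash'][0].version` in a `when:` condition or debug task.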
30583 1726853791.31952: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30583 1726853791.32058: in run() - task 02083763-bbaf-05ea-abc5-0000000026f1 30583 1726853791.32073: variable 'ansible_search_path' from source: unknown 30583 1726853791.32078: variable 'ansible_search_path' from source: unknown 30583 1726853791.32109: calling self._execute() 30583 1726853791.32196: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853791.32200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853791.32206: variable 'omit' from source: magic vars 30583 1726853791.32495: variable 'ansible_distribution_major_version' from source: facts 30583 1726853791.32504: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853791.32509: variable 'omit' from source: magic vars 30583 1726853791.32561: variable 'omit' from source: magic vars 30583 1726853791.32587: variable 'omit' from source: magic vars 30583 1726853791.32620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853791.32649: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853791.32669: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853791.32684: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853791.32693: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853791.32717: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853791.32720: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853791.32723: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30583 1726853791.32796: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853791.32801: Set connection var ansible_timeout to 10 30583 1726853791.32804: Set connection var ansible_connection to ssh 30583 1726853791.32809: Set connection var ansible_shell_executable to /bin/sh 30583 1726853791.32811: Set connection var ansible_shell_type to sh 30583 1726853791.32819: Set connection var ansible_pipelining to False 30583 1726853791.32836: variable 'ansible_shell_executable' from source: unknown 30583 1726853791.32841: variable 'ansible_connection' from source: unknown 30583 1726853791.32843: variable 'ansible_module_compression' from source: unknown 30583 1726853791.32846: variable 'ansible_shell_type' from source: unknown 30583 1726853791.32848: variable 'ansible_shell_executable' from source: unknown 30583 1726853791.32851: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853791.32853: variable 'ansible_pipelining' from source: unknown 30583 1726853791.32855: variable 'ansible_timeout' from source: unknown 30583 1726853791.32857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853791.33003: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853791.33013: variable 'omit' from source: magic vars 30583 1726853791.33018: starting attempt loop 30583 1726853791.33021: running the handler 30583 1726853791.33033: _low_level_execute_command(): starting 30583 1726853791.33040: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853791.33556: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30583 1726853791.33564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853791.33567: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853791.33570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853791.33612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853791.33615: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853791.33617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853791.33701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853791.35457: stdout chunk (state=3): >>>/root <<< 30583 1726853791.35553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853791.35587: stderr chunk (state=3): >>><<< 30583 1726853791.35593: stdout chunk (state=3): >>><<< 30583 1726853791.35615: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 
originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853791.35625: _low_level_execute_command(): starting 30583 1726853791.35631: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853791.356137-35999-114357963357039 `" && echo ansible-tmp-1726853791.356137-35999-114357963357039="` echo /root/.ansible/tmp/ansible-tmp-1726853791.356137-35999-114357963357039 `" ) && sleep 0' 30583 1726853791.36082: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853791.36085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853791.36088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853791.36098: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853791.36100: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853791.36102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853791.36137: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853791.36156: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853791.36221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853791.38266: stdout chunk (state=3): >>>ansible-tmp-1726853791.356137-35999-114357963357039=/root/.ansible/tmp/ansible-tmp-1726853791.356137-35999-114357963357039 <<< 30583 1726853791.38375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853791.38404: stderr chunk (state=3): >>><<< 30583 1726853791.38407: stdout chunk (state=3): >>><<< 30583 1726853791.38421: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853791.356137-35999-114357963357039=/root/.ansible/tmp/ansible-tmp-1726853791.356137-35999-114357963357039 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853791.38465: variable 'ansible_module_compression' from source: unknown 30583 1726853791.38509: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30583 1726853791.38560: variable 'ansible_facts' from source: unknown 30583 1726853791.38683: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853791.356137-35999-114357963357039/AnsiballZ_package_facts.py 30583 1726853791.38789: Sending initial data 30583 1726853791.38793: Sent initial data (161 bytes) 30583 1726853791.39240: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853791.39243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853791.39245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853791.39249: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853791.39252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853791.39307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853791.39310: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853791.39315: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853791.39389: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853791.41105: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30583 1726853791.41111: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853791.41178: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853791.41244: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpefa4h245 /root/.ansible/tmp/ansible-tmp-1726853791.356137-35999-114357963357039/AnsiballZ_package_facts.py <<< 30583 1726853791.41250: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853791.356137-35999-114357963357039/AnsiballZ_package_facts.py" <<< 30583 1726853791.41314: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpefa4h245" to remote "/root/.ansible/tmp/ansible-tmp-1726853791.356137-35999-114357963357039/AnsiballZ_package_facts.py" <<< 30583 1726853791.41317: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853791.356137-35999-114357963357039/AnsiballZ_package_facts.py" <<< 30583 1726853791.42515: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853791.42556: stderr chunk (state=3): >>><<< 30583 1726853791.42562: stdout chunk (state=3): >>><<< 30583 1726853791.42601: done transferring module to remote 30583 1726853791.42610: _low_level_execute_command(): starting 30583 1726853791.42615: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853791.356137-35999-114357963357039/ /root/.ansible/tmp/ansible-tmp-1726853791.356137-35999-114357963357039/AnsiballZ_package_facts.py && sleep 0' 30583 1726853791.43068: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853791.43074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853791.43076: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853791.43082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853791.43084: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853791.43131: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853791.43136: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853791.43138: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853791.43204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853791.45133: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853791.45157: stderr chunk (state=3): >>><<< 30583 1726853791.45160: stdout chunk (state=3): >>><<< 30583 1726853791.45176: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853791.45180: _low_level_execute_command(): starting 30583 1726853791.45185: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853791.356137-35999-114357963357039/AnsiballZ_package_facts.py && sleep 0' 30583 1726853791.45617: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853791.45620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853791.45623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853791.45625: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853791.45627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853791.45677: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853791.45685: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853791.45764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853791.90918: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": 
"dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": 
"libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 30583 1726853791.90936: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": 
[{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 30583 1726853791.90991: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": 
"gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 30583 1726853791.90999: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", 
"release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 30583 1726853791.91040: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": 
"2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": 
[{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": 
"rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 30583 
1726853791.91052: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", 
"version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 30583 1726853791.91059: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", 
"release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": 
"perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 30583 1726853791.91086: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": 
[{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 30583 1726853791.91094: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": 
"1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30583 1726853791.92953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853791.92983: stderr chunk (state=3): >>><<< 30583 1726853791.92991: stdout chunk (state=3): >>><<< 30583 1726853791.93028: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
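The `package_facts` payload above maps each package name to a *list* of install records (multiple entries can appear when, e.g., several kernels or multilib variants are installed). A minimal sketch of querying such a structure, using a small hypothetical subset of the data rather than the full dump:

```python
# Sketch: querying a package_facts-style dict (name -> list of install records).
# The two entries below are a hypothetical subset copied from the dump above.
packages = {
    "kernel-core": [
        {"name": "kernel-core", "version": "6.11.0",
         "release": "0.rc6.23.el10", "epoch": None, "arch": "x86_64",
         "source": "rpm"},
    ],
    "NetworkManager": [
        {"name": "NetworkManager", "version": "1.48.10",
         "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"},
    ],
}

def full_evr(pkg):
    """Render epoch:version-release, omitting a null epoch (rpm convention)."""
    evr = f"{pkg['version']}-{pkg['release']}"
    return f"{pkg['epoch']}:{evr}" if pkg["epoch"] is not None else evr

print(full_evr(packages["NetworkManager"][0]))  # -> 1:1.48.10-1.el10
print(full_evr(packages["kernel-core"][0]))     # -> 6.11.0-0.rc6.23.el10
```

In a playbook the same structure is reachable as `ansible_facts.packages`, so a task can test `"kernel-core" in ansible_facts.packages` after gathering package facts.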
30583 1726853791.94388: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853791.356137-35999-114357963357039/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853791.94407: _low_level_execute_command(): starting 30583 1726853791.94410: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853791.356137-35999-114357963357039/ > /dev/null 2>&1 && sleep 0' 30583 1726853791.94863: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853791.94866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853791.94868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853791.94875: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853791.94878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853791.94924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853791.94928: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853791.94935: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853791.95006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853791.96922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853791.96948: stderr chunk (state=3): >>><<< 30583 1726853791.96951: stdout chunk (state=3): >>><<< 30583 1726853791.96964: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853791.96972: handler run complete 30583 1726853791.97437: variable 'ansible_facts' from source: unknown 30583 1726853791.97775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853791.98825: variable 'ansible_facts' from source: unknown 30583 1726853791.99072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853791.99449: attempt loop complete, returning result 30583 1726853791.99461: _execute() done 30583 1726853791.99464: dumping result to json 30583 1726853791.99578: done dumping result, returning 30583 1726853791.99587: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-05ea-abc5-0000000026f1] 30583 1726853791.99591: sending task result for task 02083763-bbaf-05ea-abc5-0000000026f1 30583 1726853792.00903: done sending task result for task 02083763-bbaf-05ea-abc5-0000000026f1 30583 1726853792.00907: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853792.01006: no more pending results, returning what we have 30583 1726853792.01008: results queue empty 30583 1726853792.01009: checking for any_errors_fatal 30583 1726853792.01012: done checking for any_errors_fatal 30583 1726853792.01013: checking for max_fail_percentage 30583 1726853792.01014: done checking for max_fail_percentage 30583 1726853792.01015: checking to see if all hosts have failed and the running result is not ok 30583 1726853792.01016: done checking to see if all hosts have failed 30583 1726853792.01016: getting the remaining hosts for this loop 30583 1726853792.01018: done getting the remaining hosts for this loop 30583 1726853792.01021: getting 
the next task for host managed_node2 30583 1726853792.01026: done getting next task for host managed_node2 30583 1726853792.01028: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853792.01032: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853792.01041: getting variables 30583 1726853792.01042: in VariableManager get_vars() 30583 1726853792.01068: Calling all_inventory to load vars for managed_node2 30583 1726853792.01072: Calling groups_inventory to load vars for managed_node2 30583 1726853792.01074: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853792.01080: Calling all_plugins_play to load vars for managed_node2 30583 1726853792.01082: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853792.01083: Calling groups_plugins_play to load vars for managed_node2 30583 1726853792.01768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853792.02643: done with get_vars() 30583 1726853792.02668: done getting variables 30583 1726853792.02714: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:36:32 -0400 (0:00:00.712) 0:02:07.364 ****** 30583 1726853792.02740: entering _queue_task() for managed_node2/debug 30583 1726853792.03008: worker is 1 (out of 1 available) 30583 1726853792.03023: exiting _queue_task() for managed_node2/debug 30583 1726853792.03037: done queuing things up, now waiting for results queue to drain 30583 1726853792.03038: waiting for pending results... 
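Each task banner in the log carries two durations, e.g. `(0:00:00.712) 0:02:07.364`: the first is the elapsed time of the task that just finished, the second is the cumulative playbook runtime. As a sanity check (a sketch using only the stamps visible in this log, not Ansible's own timer code), the next banner's cumulative stamp should equal the current one plus the per-task duration, up to millisecond rounding:

```python
from datetime import timedelta

def parse(stamp: str) -> timedelta:
    """Parse an 'H:MM:SS.mmm' banner timestamp into a timedelta."""
    h, m, s = stamp.split(":")
    return timedelta(hours=int(h), minutes=int(m), seconds=float(s))

# Stamps taken from the banners in this log:
cumulative = parse("0:02:07.364")   # after "Print network provider" is queued
per_task = parse("0:00:00.037")     # duration reported by the next banner
next_cumulative = parse("0:02:07.402")

# Agreement within 2 ms (each stamp is independently rounded):
print(abs((cumulative + per_task - next_cumulative).total_seconds()) < 0.002)  # True
```

The following banner in the log, `(0:00:00.030) 0:02:07.432`, continues the same running sum.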
30583 1726853792.03231: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 30583 1726853792.03329: in run() - task 02083763-bbaf-05ea-abc5-000000002695 30583 1726853792.03341: variable 'ansible_search_path' from source: unknown 30583 1726853792.03345: variable 'ansible_search_path' from source: unknown 30583 1726853792.03377: calling self._execute() 30583 1726853792.03453: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853792.03456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853792.03465: variable 'omit' from source: magic vars 30583 1726853792.03764: variable 'ansible_distribution_major_version' from source: facts 30583 1726853792.03773: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853792.03779: variable 'omit' from source: magic vars 30583 1726853792.03824: variable 'omit' from source: magic vars 30583 1726853792.03893: variable 'network_provider' from source: set_fact 30583 1726853792.03907: variable 'omit' from source: magic vars 30583 1726853792.03942: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853792.03970: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853792.03987: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853792.04001: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853792.04011: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853792.04037: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853792.04040: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 
1726853792.04043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853792.04112: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853792.04117: Set connection var ansible_timeout to 10 30583 1726853792.04120: Set connection var ansible_connection to ssh 30583 1726853792.04125: Set connection var ansible_shell_executable to /bin/sh 30583 1726853792.04129: Set connection var ansible_shell_type to sh 30583 1726853792.04139: Set connection var ansible_pipelining to False 30583 1726853792.04156: variable 'ansible_shell_executable' from source: unknown 30583 1726853792.04162: variable 'ansible_connection' from source: unknown 30583 1726853792.04164: variable 'ansible_module_compression' from source: unknown 30583 1726853792.04167: variable 'ansible_shell_type' from source: unknown 30583 1726853792.04169: variable 'ansible_shell_executable' from source: unknown 30583 1726853792.04172: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853792.04175: variable 'ansible_pipelining' from source: unknown 30583 1726853792.04177: variable 'ansible_timeout' from source: unknown 30583 1726853792.04179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853792.04280: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853792.04289: variable 'omit' from source: magic vars 30583 1726853792.04294: starting attempt loop 30583 1726853792.04297: running the handler 30583 1726853792.04333: handler run complete 30583 1726853792.04343: attempt loop complete, returning result 30583 1726853792.04347: _execute() done 30583 1726853792.04350: dumping result to json 30583 1726853792.04352: done dumping result, returning 
30583 1726853792.04362: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-05ea-abc5-000000002695] 30583 1726853792.04365: sending task result for task 02083763-bbaf-05ea-abc5-000000002695 30583 1726853792.04444: done sending task result for task 02083763-bbaf-05ea-abc5-000000002695 30583 1726853792.04447: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 30583 1726853792.04535: no more pending results, returning what we have 30583 1726853792.04539: results queue empty 30583 1726853792.04540: checking for any_errors_fatal 30583 1726853792.04552: done checking for any_errors_fatal 30583 1726853792.04553: checking for max_fail_percentage 30583 1726853792.04554: done checking for max_fail_percentage 30583 1726853792.04555: checking to see if all hosts have failed and the running result is not ok 30583 1726853792.04556: done checking to see if all hosts have failed 30583 1726853792.04557: getting the remaining hosts for this loop 30583 1726853792.04561: done getting the remaining hosts for this loop 30583 1726853792.04564: getting the next task for host managed_node2 30583 1726853792.04578: done getting next task for host managed_node2 30583 1726853792.04582: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853792.04586: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853792.04599: getting variables 30583 1726853792.04601: in VariableManager get_vars() 30583 1726853792.04641: Calling all_inventory to load vars for managed_node2 30583 1726853792.04643: Calling groups_inventory to load vars for managed_node2 30583 1726853792.04645: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853792.04653: Calling all_plugins_play to load vars for managed_node2 30583 1726853792.04655: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853792.04660: Calling groups_plugins_play to load vars for managed_node2 30583 1726853792.05563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853792.06427: done with get_vars() 30583 1726853792.06444: done getting variables 30583 1726853792.06489: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:36:32 -0400 (0:00:00.037) 0:02:07.402 ****** 30583 1726853792.06518: entering _queue_task() for managed_node2/fail 30583 1726853792.06754: worker is 1 (out of 1 available) 30583 1726853792.06772: exiting _queue_task() for managed_node2/fail 30583 1726853792.06785: done queuing things up, now waiting for results queue to drain 30583 1726853792.06787: waiting for pending results... 30583 1726853792.06967: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30583 1726853792.07065: in run() - task 02083763-bbaf-05ea-abc5-000000002696 30583 1726853792.07079: variable 'ansible_search_path' from source: unknown 30583 1726853792.07083: variable 'ansible_search_path' from source: unknown 30583 1726853792.07112: calling self._execute() 30583 1726853792.07188: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853792.07192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853792.07200: variable 'omit' from source: magic vars 30583 1726853792.07481: variable 'ansible_distribution_major_version' from source: facts 30583 1726853792.07490: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853792.07569: variable 'network_state' from source: role '' defaults 30583 1726853792.07579: Evaluated conditional (network_state != {}): False 30583 1726853792.07582: when evaluation is False, skipping this task 30583 1726853792.07585: _execute() done 30583 1726853792.07588: dumping result to json 30583 1726853792.07591: done dumping result, returning 30583 1726853792.07598: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-05ea-abc5-000000002696] 30583 1726853792.07601: sending task result for task 02083763-bbaf-05ea-abc5-000000002696 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853792.07739: no more pending results, returning what we have 30583 1726853792.07743: results queue empty 30583 1726853792.07744: checking for any_errors_fatal 30583 1726853792.07751: done checking for any_errors_fatal 30583 1726853792.07752: checking for max_fail_percentage 30583 1726853792.07754: done checking for max_fail_percentage 30583 1726853792.07754: checking to see if all hosts have failed and the running result is not ok 30583 1726853792.07755: done checking to see if all hosts have failed 30583 1726853792.07756: getting the remaining hosts for this loop 30583 1726853792.07760: done getting the remaining hosts for this loop 30583 1726853792.07764: getting the next task for host managed_node2 30583 1726853792.07773: done getting next task for host managed_node2 30583 1726853792.07776: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30583 1726853792.07782: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853792.07808: getting variables 30583 1726853792.07810: in VariableManager get_vars() 30583 1726853792.07846: Calling all_inventory to load vars for managed_node2 30583 1726853792.07849: Calling groups_inventory to load vars for managed_node2 30583 1726853792.07851: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853792.07861: Calling all_plugins_play to load vars for managed_node2 30583 1726853792.07863: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853792.07866: Calling groups_plugins_play to load vars for managed_node2 30583 1726853792.07880: done sending task result for task 02083763-bbaf-05ea-abc5-000000002696 30583 1726853792.07883: WORKER PROCESS EXITING 30583 1726853792.08625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853792.09491: done with get_vars() 30583 1726853792.09508: done getting variables 30583 1726853792.09547: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:36:32 -0400 (0:00:00.030) 0:02:07.432 ****** 30583 1726853792.09575: entering _queue_task() for managed_node2/fail 30583 1726853792.09804: worker is 1 (out of 1 available) 30583 1726853792.09820: exiting _queue_task() for managed_node2/fail 30583 1726853792.09834: done queuing things up, now waiting for results queue to drain 30583 1726853792.09836: waiting for pending results... 30583 1726853792.10023: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30583 1726853792.10112: in run() - task 02083763-bbaf-05ea-abc5-000000002697 30583 1726853792.10122: variable 'ansible_search_path' from source: unknown 30583 1726853792.10126: variable 'ansible_search_path' from source: unknown 30583 1726853792.10155: calling self._execute() 30583 1726853792.10233: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853792.10237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853792.10245: variable 'omit' from source: magic vars 30583 1726853792.10532: variable 'ansible_distribution_major_version' from source: facts 30583 1726853792.10541: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853792.10627: variable 'network_state' from source: role '' defaults 30583 1726853792.10637: Evaluated conditional (network_state != {}): False 30583 1726853792.10640: when evaluation is False, skipping this task 30583 1726853792.10643: _execute() done 30583 1726853792.10645: dumping result to json 30583 1726853792.10648: done dumping result, returning 30583 1726853792.10656: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the 
system version of the managed host is below 8 [02083763-bbaf-05ea-abc5-000000002697] 30583 1726853792.10661: sending task result for task 02083763-bbaf-05ea-abc5-000000002697 30583 1726853792.10748: done sending task result for task 02083763-bbaf-05ea-abc5-000000002697 30583 1726853792.10752: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853792.10804: no more pending results, returning what we have 30583 1726853792.10808: results queue empty 30583 1726853792.10809: checking for any_errors_fatal 30583 1726853792.10818: done checking for any_errors_fatal 30583 1726853792.10819: checking for max_fail_percentage 30583 1726853792.10820: done checking for max_fail_percentage 30583 1726853792.10821: checking to see if all hosts have failed and the running result is not ok 30583 1726853792.10822: done checking to see if all hosts have failed 30583 1726853792.10822: getting the remaining hosts for this loop 30583 1726853792.10824: done getting the remaining hosts for this loop 30583 1726853792.10828: getting the next task for host managed_node2 30583 1726853792.10836: done getting next task for host managed_node2 30583 1726853792.10839: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30583 1726853792.10843: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853792.10870: getting variables 30583 1726853792.10874: in VariableManager get_vars() 30583 1726853792.10910: Calling all_inventory to load vars for managed_node2 30583 1726853792.10912: Calling groups_inventory to load vars for managed_node2 30583 1726853792.10914: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853792.10922: Calling all_plugins_play to load vars for managed_node2 30583 1726853792.10924: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853792.10927: Calling groups_plugins_play to load vars for managed_node2 30583 1726853792.11832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853792.12689: done with get_vars() 30583 1726853792.12707: done getting variables 30583 1726853792.12748: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 
September 2024 13:36:32 -0400 (0:00:00.031) 0:02:07.464 ****** 30583 1726853792.12776: entering _queue_task() for managed_node2/fail 30583 1726853792.13018: worker is 1 (out of 1 available) 30583 1726853792.13033: exiting _queue_task() for managed_node2/fail 30583 1726853792.13046: done queuing things up, now waiting for results queue to drain 30583 1726853792.13048: waiting for pending results... 30583 1726853792.13241: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30583 1726853792.13338: in run() - task 02083763-bbaf-05ea-abc5-000000002698 30583 1726853792.13348: variable 'ansible_search_path' from source: unknown 30583 1726853792.13352: variable 'ansible_search_path' from source: unknown 30583 1726853792.13389: calling self._execute() 30583 1726853792.13470: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853792.13477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853792.13485: variable 'omit' from source: magic vars 30583 1726853792.13776: variable 'ansible_distribution_major_version' from source: facts 30583 1726853792.13785: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853792.13918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853792.15469: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853792.15524: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853792.15552: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853792.15584: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 
1726853792.15605: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853792.15660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853792.15687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853792.15705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853792.15730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853792.15741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853792.15814: variable 'ansible_distribution_major_version' from source: facts 30583 1726853792.15828: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30583 1726853792.15911: variable 'ansible_distribution' from source: facts 30583 1726853792.15914: variable '__network_rh_distros' from source: role '' defaults 30583 1726853792.15922: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30583 1726853792.16087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853792.16107: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853792.16124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853792.16149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853792.16159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853792.16195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853792.16215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853792.16230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853792.16254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853792.16268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 
1726853792.16297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853792.16313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853792.16331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853792.16355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853792.16368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853792.16566: variable 'network_connections' from source: include params 30583 1726853792.16576: variable 'interface' from source: play vars 30583 1726853792.16620: variable 'interface' from source: play vars 30583 1726853792.16628: variable 'network_state' from source: role '' defaults 30583 1726853792.16681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853792.16792: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853792.16819: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853792.16841: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853792.16862: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853792.16906: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853792.16921: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853792.16942: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853792.16959: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853792.16983: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30583 1726853792.16987: when evaluation is False, skipping this task 30583 1726853792.16989: _execute() done 30583 1726853792.16992: dumping result to json 30583 1726853792.16995: done dumping result, returning 30583 1726853792.17003: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-05ea-abc5-000000002698] 30583 1726853792.17005: sending task result for task 02083763-bbaf-05ea-abc5-000000002698 30583 1726853792.17091: done sending task result for task 02083763-bbaf-05ea-abc5-000000002698 30583 1726853792.17094: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": 
"network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30583 1726853792.17138: no more pending results, returning what we have 30583 1726853792.17142: results queue empty 30583 1726853792.17143: checking for any_errors_fatal 30583 1726853792.17151: done checking for any_errors_fatal 30583 1726853792.17152: checking for max_fail_percentage 30583 1726853792.17154: done checking for max_fail_percentage 30583 1726853792.17155: checking to see if all hosts have failed and the running result is not ok 30583 1726853792.17156: done checking to see if all hosts have failed 30583 1726853792.17156: getting the remaining hosts for this loop 30583 1726853792.17158: done getting the remaining hosts for this loop 30583 1726853792.17162: getting the next task for host managed_node2 30583 1726853792.17170: done getting next task for host managed_node2 30583 1726853792.17176: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30583 1726853792.17181: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853792.17213: getting variables 30583 1726853792.17214: in VariableManager get_vars() 30583 1726853792.17262: Calling all_inventory to load vars for managed_node2 30583 1726853792.17265: Calling groups_inventory to load vars for managed_node2 30583 1726853792.17267: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853792.17280: Calling all_plugins_play to load vars for managed_node2 30583 1726853792.17283: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853792.17286: Calling groups_plugins_play to load vars for managed_node2 30583 1726853792.18113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853792.18972: done with get_vars() 30583 1726853792.18990: done getting variables 30583 1726853792.19031: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:36:32 -0400 (0:00:00.062) 0:02:07.527 ****** 30583 1726853792.19055: entering _queue_task() for managed_node2/dnf 30583 1726853792.19305: worker is 1 (out of 1 available) 30583 1726853792.19321: exiting _queue_task() for managed_node2/dnf 30583 1726853792.19334: done queuing things up, now waiting for results queue to drain 30583 1726853792.19335: waiting for pending results... 30583 1726853792.19538: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30583 1726853792.19638: in run() - task 02083763-bbaf-05ea-abc5-000000002699 30583 1726853792.19650: variable 'ansible_search_path' from source: unknown 30583 1726853792.19654: variable 'ansible_search_path' from source: unknown 30583 1726853792.19687: calling self._execute() 30583 1726853792.19764: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853792.19768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853792.19777: variable 'omit' from source: magic vars 30583 1726853792.20072: variable 'ansible_distribution_major_version' from source: facts 30583 1726853792.20081: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853792.20218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853792.21960: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853792.22005: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853792.22042: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 
1726853792.22070: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853792.22093: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853792.22150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853792.22174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853792.22192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853792.22218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853792.22228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853792.22309: variable 'ansible_distribution' from source: facts 30583 1726853792.22312: variable 'ansible_distribution_major_version' from source: facts 30583 1726853792.22325: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30583 1726853792.22405: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853792.22486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30583 1726853792.22504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853792.22521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853792.22545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853792.22556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853792.22585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853792.22605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853792.22619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853792.22643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853792.22653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853792.22681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853792.22697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853792.22715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853792.22739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853792.22749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853792.22846: variable 'network_connections' from source: include params 30583 1726853792.22857: variable 'interface' from source: play vars 30583 1726853792.22902: variable 'interface' from source: play vars 30583 1726853792.22951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853792.23064: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853792.23091: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853792.23113: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853792.23133: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853792.23165: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853792.23183: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853792.23203: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853792.23220: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853792.23261: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853792.23409: variable 'network_connections' from source: include params 30583 1726853792.23413: variable 'interface' from source: play vars 30583 1726853792.23455: variable 'interface' from source: play vars 30583 1726853792.23477: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853792.23480: when evaluation is False, skipping this task 30583 1726853792.23483: _execute() done 30583 1726853792.23485: dumping result to json 30583 1726853792.23487: done dumping result, returning 30583 1726853792.23494: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-000000002699] 30583 1726853792.23498: sending task result for task 02083763-bbaf-05ea-abc5-000000002699 30583 1726853792.23590: 
done sending task result for task 02083763-bbaf-05ea-abc5-000000002699 30583 1726853792.23593: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853792.23643: no more pending results, returning what we have 30583 1726853792.23647: results queue empty 30583 1726853792.23648: checking for any_errors_fatal 30583 1726853792.23654: done checking for any_errors_fatal 30583 1726853792.23655: checking for max_fail_percentage 30583 1726853792.23659: done checking for max_fail_percentage 30583 1726853792.23660: checking to see if all hosts have failed and the running result is not ok 30583 1726853792.23661: done checking to see if all hosts have failed 30583 1726853792.23662: getting the remaining hosts for this loop 30583 1726853792.23664: done getting the remaining hosts for this loop 30583 1726853792.23667: getting the next task for host managed_node2 30583 1726853792.23677: done getting next task for host managed_node2 30583 1726853792.23681: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30583 1726853792.23686: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853792.23718: getting variables 30583 1726853792.23719: in VariableManager get_vars() 30583 1726853792.23768: Calling all_inventory to load vars for managed_node2 30583 1726853792.23776: Calling groups_inventory to load vars for managed_node2 30583 1726853792.23779: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853792.23788: Calling all_plugins_play to load vars for managed_node2 30583 1726853792.23790: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853792.23793: Calling groups_plugins_play to load vars for managed_node2 30583 1726853792.24739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853792.25618: done with get_vars() 30583 1726853792.25636: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30583 1726853792.25693: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:36:32 -0400 (0:00:00.066) 0:02:07.594 ****** 30583 1726853792.25718: entering _queue_task() for managed_node2/yum 30583 1726853792.25994: worker is 1 (out of 1 available) 30583 1726853792.26009: exiting _queue_task() for managed_node2/yum 30583 1726853792.26023: done queuing things up, now waiting for results queue to drain 30583 1726853792.26024: waiting for pending results... 30583 1726853792.26222: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30583 1726853792.26326: in run() - task 02083763-bbaf-05ea-abc5-00000000269a 30583 1726853792.26336: variable 'ansible_search_path' from source: unknown 30583 1726853792.26339: variable 'ansible_search_path' from source: unknown 30583 1726853792.26376: calling self._execute() 30583 1726853792.26447: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853792.26450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853792.26461: variable 'omit' from source: magic vars 30583 1726853792.26746: variable 'ansible_distribution_major_version' from source: facts 30583 1726853792.26755: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853792.26883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853792.28412: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853792.28468: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853792.28495: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853792.28524: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853792.28544: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853792.28603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853792.28623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853792.28643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853792.28673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853792.28684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853792.28754: variable 'ansible_distribution_major_version' from source: facts 30583 1726853792.28772: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30583 1726853792.28775: when evaluation is False, skipping this task 30583 1726853792.28778: _execute() done 30583 1726853792.28780: dumping result to json 30583 1726853792.28783: done dumping result, returning 30583 1726853792.28790: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-00000000269a] 30583 1726853792.28793: sending task result for task 02083763-bbaf-05ea-abc5-00000000269a 30583 1726853792.28883: done sending task result for task 02083763-bbaf-05ea-abc5-00000000269a 30583 1726853792.28887: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30583 1726853792.28934: no more pending results, returning what we have 30583 1726853792.28937: results queue empty 30583 1726853792.28938: checking for any_errors_fatal 30583 1726853792.28943: done checking for any_errors_fatal 30583 1726853792.28944: checking for max_fail_percentage 30583 1726853792.28946: done checking for max_fail_percentage 30583 1726853792.28947: checking to see if all hosts have failed and the running result is not ok 30583 1726853792.28947: done checking to see if all hosts have failed 30583 1726853792.28948: getting the remaining hosts for this loop 30583 1726853792.28950: done getting the remaining hosts for this loop 30583 1726853792.28954: getting the next task for host managed_node2 30583 1726853792.28962: done getting next task for host managed_node2 30583 1726853792.28966: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30583 1726853792.28976: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853792.29007: getting variables 30583 1726853792.29008: in VariableManager get_vars() 30583 1726853792.29053: Calling all_inventory to load vars for managed_node2 30583 1726853792.29056: Calling groups_inventory to load vars for managed_node2 30583 1726853792.29058: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853792.29066: Calling all_plugins_play to load vars for managed_node2 30583 1726853792.29069: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853792.29073: Calling groups_plugins_play to load vars for managed_node2 30583 1726853792.29880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853792.35138: done with get_vars() 30583 1726853792.35158: done getting variables 30583 1726853792.35195: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:36:32 -0400 (0:00:00.094) 0:02:07.689 ****** 30583 1726853792.35217: entering _queue_task() for managed_node2/fail 30583 1726853792.35495: worker is 1 (out of 1 available) 30583 1726853792.35509: exiting _queue_task() for managed_node2/fail 30583 1726853792.35521: done queuing things up, now waiting for results queue to drain 30583 1726853792.35523: waiting for pending results... 30583 1726853792.35732: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30583 1726853792.35846: in run() - task 02083763-bbaf-05ea-abc5-00000000269b 30583 1726853792.35861: variable 'ansible_search_path' from source: unknown 30583 1726853792.35866: variable 'ansible_search_path' from source: unknown 30583 1726853792.35891: calling self._execute() 30583 1726853792.35972: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853792.35977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853792.35986: variable 'omit' from source: magic vars 30583 1726853792.36268: variable 'ansible_distribution_major_version' from source: facts 30583 1726853792.36279: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853792.36366: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853792.36498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853792.37997: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853792.38049: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853792.38079: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853792.38106: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853792.38128: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853792.38188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853792.38210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853792.38228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853792.38256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853792.38268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853792.38302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853792.38319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853792.38335: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853792.38363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853792.38373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853792.38401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853792.38418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853792.38434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853792.38461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853792.38473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853792.38588: variable 'network_connections' from source: include params 30583 1726853792.38598: variable 'interface' from source: play vars 30583 1726853792.38646: variable 'interface' from source: play vars 30583 1726853792.38698: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853792.38812: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853792.38839: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853792.38864: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853792.38884: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853792.38914: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853792.38930: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853792.38946: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853792.38966: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853792.39005: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853792.39155: variable 'network_connections' from source: include params 30583 1726853792.39161: variable 'interface' from source: play vars 30583 1726853792.39206: variable 'interface' from source: play vars 30583 1726853792.39225: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30583 1726853792.39229: when evaluation is False, skipping this task 30583 
1726853792.39232: _execute() done 30583 1726853792.39235: dumping result to json 30583 1726853792.39237: done dumping result, returning 30583 1726853792.39243: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-00000000269b] 30583 1726853792.39248: sending task result for task 02083763-bbaf-05ea-abc5-00000000269b 30583 1726853792.39343: done sending task result for task 02083763-bbaf-05ea-abc5-00000000269b 30583 1726853792.39346: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30583 1726853792.39399: no more pending results, returning what we have 30583 1726853792.39403: results queue empty 30583 1726853792.39404: checking for any_errors_fatal 30583 1726853792.39410: done checking for any_errors_fatal 30583 1726853792.39410: checking for max_fail_percentage 30583 1726853792.39412: done checking for max_fail_percentage 30583 1726853792.39413: checking to see if all hosts have failed and the running result is not ok 30583 1726853792.39414: done checking to see if all hosts have failed 30583 1726853792.39414: getting the remaining hosts for this loop 30583 1726853792.39416: done getting the remaining hosts for this loop 30583 1726853792.39420: getting the next task for host managed_node2 30583 1726853792.39427: done getting next task for host managed_node2 30583 1726853792.39431: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30583 1726853792.39437: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853792.39474: getting variables 30583 1726853792.39476: in VariableManager get_vars() 30583 1726853792.39518: Calling all_inventory to load vars for managed_node2 30583 1726853792.39521: Calling groups_inventory to load vars for managed_node2 30583 1726853792.39523: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853792.39531: Calling all_plugins_play to load vars for managed_node2 30583 1726853792.39533: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853792.39536: Calling groups_plugins_play to load vars for managed_node2 30583 1726853792.40348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853792.41227: done with get_vars() 30583 1726853792.41243: done getting variables 30583 1726853792.41289: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:36:32 -0400 (0:00:00.060) 0:02:07.750 ****** 30583 1726853792.41316: entering _queue_task() for managed_node2/package 30583 1726853792.41577: worker is 1 (out of 1 available) 30583 1726853792.41591: exiting _queue_task() for managed_node2/package 30583 1726853792.41604: done queuing things up, now waiting for results queue to drain 30583 1726853792.41606: waiting for pending results... 30583 1726853792.41794: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 30583 1726853792.41906: in run() - task 02083763-bbaf-05ea-abc5-00000000269c 30583 1726853792.41915: variable 'ansible_search_path' from source: unknown 30583 1726853792.41918: variable 'ansible_search_path' from source: unknown 30583 1726853792.41949: calling self._execute() 30583 1726853792.42026: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853792.42030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853792.42038: variable 'omit' from source: magic vars 30583 1726853792.42321: variable 'ansible_distribution_major_version' from source: facts 30583 1726853792.42331: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853792.42461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853792.42653: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853792.42687: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853792.42713: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853792.42768: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853792.42853: variable 'network_packages' from source: role '' defaults 30583 1726853792.42927: variable '__network_provider_setup' from source: role '' defaults 30583 1726853792.42936: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853792.42981: variable '__network_service_name_default_nm' from source: role '' defaults 30583 1726853792.42989: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853792.43077: variable '__network_packages_default_nm' from source: role '' defaults 30583 1726853792.43149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853792.44473: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853792.44516: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853792.44543: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853792.44572: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853792.44592: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853792.44917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853792.44937: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853792.44953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853792.44983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853792.44995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853792.45025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853792.45042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853792.45060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853792.45085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853792.45101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 
1726853792.45234: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30583 1726853792.45304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853792.45322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853792.45339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853792.45361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853792.45378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853792.45439: variable 'ansible_python' from source: facts 30583 1726853792.45452: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30583 1726853792.45507: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 1726853792.45564: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853792.45645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853792.45661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853792.45678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853792.45701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853792.45711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853792.45742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853792.45763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853792.45782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853792.45805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853792.45816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853792.45912: variable 'network_connections' from source: include params 
30583 1726853792.45915: variable 'interface' from source: play vars 30583 1726853792.45988: variable 'interface' from source: play vars 30583 1726853792.46035: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853792.46053: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853792.46075: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853792.46101: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853792.46137: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853792.46314: variable 'network_connections' from source: include params 30583 1726853792.46318: variable 'interface' from source: play vars 30583 1726853792.46388: variable 'interface' from source: play vars 30583 1726853792.46414: variable '__network_packages_default_wireless' from source: role '' defaults 30583 1726853792.46477: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853792.46662: variable 'network_connections' from source: include params 30583 1726853792.46666: variable 'interface' from source: play vars 30583 1726853792.46711: variable 'interface' from source: play vars 30583 1726853792.46727: variable '__network_packages_default_team' from source: role '' defaults 30583 1726853792.46783: variable '__network_team_connections_defined' from source: role '' defaults 30583 1726853792.46976: variable 'network_connections' 
from source: include params 30583 1726853792.46980: variable 'interface' from source: play vars 30583 1726853792.47024: variable 'interface' from source: play vars 30583 1726853792.47063: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853792.47104: variable '__network_service_name_default_initscripts' from source: role '' defaults 30583 1726853792.47109: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853792.47150: variable '__network_packages_default_initscripts' from source: role '' defaults 30583 1726853792.47279: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30583 1726853792.47574: variable 'network_connections' from source: include params 30583 1726853792.47577: variable 'interface' from source: play vars 30583 1726853792.47619: variable 'interface' from source: play vars 30583 1726853792.47625: variable 'ansible_distribution' from source: facts 30583 1726853792.47628: variable '__network_rh_distros' from source: role '' defaults 30583 1726853792.47635: variable 'ansible_distribution_major_version' from source: facts 30583 1726853792.47645: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30583 1726853792.47748: variable 'ansible_distribution' from source: facts 30583 1726853792.47752: variable '__network_rh_distros' from source: role '' defaults 30583 1726853792.47754: variable 'ansible_distribution_major_version' from source: facts 30583 1726853792.47767: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30583 1726853792.47874: variable 'ansible_distribution' from source: facts 30583 1726853792.47877: variable '__network_rh_distros' from source: role '' defaults 30583 1726853792.47882: variable 'ansible_distribution_major_version' from source: facts 30583 1726853792.47906: variable 'network_provider' from source: set_fact 30583 
1726853792.47918: variable 'ansible_facts' from source: unknown 30583 1726853792.48284: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30583 1726853792.48288: when evaluation is False, skipping this task 30583 1726853792.48290: _execute() done 30583 1726853792.48293: dumping result to json 30583 1726853792.48296: done dumping result, returning 30583 1726853792.48303: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-05ea-abc5-00000000269c] 30583 1726853792.48306: sending task result for task 02083763-bbaf-05ea-abc5-00000000269c 30583 1726853792.48402: done sending task result for task 02083763-bbaf-05ea-abc5-00000000269c 30583 1726853792.48405: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30583 1726853792.48454: no more pending results, returning what we have 30583 1726853792.48460: results queue empty 30583 1726853792.48461: checking for any_errors_fatal 30583 1726853792.48467: done checking for any_errors_fatal 30583 1726853792.48468: checking for max_fail_percentage 30583 1726853792.48470: done checking for max_fail_percentage 30583 1726853792.48472: checking to see if all hosts have failed and the running result is not ok 30583 1726853792.48473: done checking to see if all hosts have failed 30583 1726853792.48474: getting the remaining hosts for this loop 30583 1726853792.48476: done getting the remaining hosts for this loop 30583 1726853792.48480: getting the next task for host managed_node2 30583 1726853792.48488: done getting next task for host managed_node2 30583 1726853792.48492: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30583 1726853792.48498: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853792.48527: getting variables 30583 1726853792.48529: in VariableManager get_vars() 30583 1726853792.48586: Calling all_inventory to load vars for managed_node2 30583 1726853792.48589: Calling groups_inventory to load vars for managed_node2 30583 1726853792.48591: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853792.48600: Calling all_plugins_play to load vars for managed_node2 30583 1726853792.48602: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853792.48605: Calling groups_plugins_play to load vars for managed_node2 30583 1726853792.49575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853792.50433: done with get_vars() 30583 1726853792.50449: done getting variables 30583 1726853792.50493: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:36:32 -0400 (0:00:00.091) 0:02:07.842 ****** 30583 1726853792.50517: entering _queue_task() for managed_node2/package 30583 1726853792.50755: worker is 1 (out of 1 available) 30583 1726853792.50772: exiting _queue_task() for managed_node2/package 30583 1726853792.50785: done queuing things up, now waiting for results queue to drain 30583 1726853792.50787: waiting for pending results... 
30583 1726853792.50974: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30583 1726853792.51097: in run() - task 02083763-bbaf-05ea-abc5-00000000269d 30583 1726853792.51110: variable 'ansible_search_path' from source: unknown 30583 1726853792.51115: variable 'ansible_search_path' from source: unknown 30583 1726853792.51143: calling self._execute() 30583 1726853792.51228: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853792.51231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853792.51240: variable 'omit' from source: magic vars 30583 1726853792.51526: variable 'ansible_distribution_major_version' from source: facts 30583 1726853792.51534: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853792.51627: variable 'network_state' from source: role '' defaults 30583 1726853792.51635: Evaluated conditional (network_state != {}): False 30583 1726853792.51638: when evaluation is False, skipping this task 30583 1726853792.51641: _execute() done 30583 1726853792.51643: dumping result to json 30583 1726853792.51647: done dumping result, returning 30583 1726853792.51673: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-05ea-abc5-00000000269d] 30583 1726853792.51676: sending task result for task 02083763-bbaf-05ea-abc5-00000000269d 30583 1726853792.51750: done sending task result for task 02083763-bbaf-05ea-abc5-00000000269d 30583 1726853792.51753: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853792.51814: no more pending results, returning what we have 30583 1726853792.51818: results queue empty 30583 1726853792.51819: checking 
for any_errors_fatal 30583 1726853792.51825: done checking for any_errors_fatal 30583 1726853792.51826: checking for max_fail_percentage 30583 1726853792.51828: done checking for max_fail_percentage 30583 1726853792.51829: checking to see if all hosts have failed and the running result is not ok 30583 1726853792.51830: done checking to see if all hosts have failed 30583 1726853792.51830: getting the remaining hosts for this loop 30583 1726853792.51832: done getting the remaining hosts for this loop 30583 1726853792.51837: getting the next task for host managed_node2 30583 1726853792.51845: done getting next task for host managed_node2 30583 1726853792.51848: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30583 1726853792.51853: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853792.51880: getting variables 30583 1726853792.51882: in VariableManager get_vars() 30583 1726853792.51917: Calling all_inventory to load vars for managed_node2 30583 1726853792.51919: Calling groups_inventory to load vars for managed_node2 30583 1726853792.51921: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853792.51929: Calling all_plugins_play to load vars for managed_node2 30583 1726853792.51931: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853792.51933: Calling groups_plugins_play to load vars for managed_node2 30583 1726853792.52676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853792.53540: done with get_vars() 30583 1726853792.53556: done getting variables 30583 1726853792.53600: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:36:32 -0400 (0:00:00.031) 0:02:07.873 ****** 30583 1726853792.53624: entering _queue_task() for managed_node2/package 30583 1726853792.53841: worker is 1 (out of 1 available) 30583 1726853792.53853: exiting _queue_task() for managed_node2/package 30583 1726853792.53867: done queuing things up, now waiting for results queue to drain 30583 1726853792.53868: waiting for pending results... 
30583 1726853792.54053: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
30583 1726853792.54168: in run() - task 02083763-bbaf-05ea-abc5-00000000269e
30583 1726853792.54180: variable 'ansible_search_path' from source: unknown
30583 1726853792.54184: variable 'ansible_search_path' from source: unknown
30583 1726853792.54214: calling self._execute()
30583 1726853792.54293: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853792.54296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853792.54305: variable 'omit' from source: magic vars
30583 1726853792.54577: variable 'ansible_distribution_major_version' from source: facts
30583 1726853792.54586: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853792.54676: variable 'network_state' from source: role '' defaults
30583 1726853792.54685: Evaluated conditional (network_state != {}): False
30583 1726853792.54688: when evaluation is False, skipping this task
30583 1726853792.54691: _execute() done
30583 1726853792.54693: dumping result to json
30583 1726853792.54696: done dumping result, returning
30583 1726853792.54704: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-05ea-abc5-00000000269e]
30583 1726853792.54707: sending task result for task 02083763-bbaf-05ea-abc5-00000000269e
30583 1726853792.54794: done sending task result for task 02083763-bbaf-05ea-abc5-00000000269e
30583 1726853792.54797: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30583 1726853792.54841: no more pending results, returning what we have
30583 1726853792.54845: results queue empty
30583 1726853792.54847: checking for any_errors_fatal
30583 1726853792.54853: done checking for any_errors_fatal
30583 1726853792.54853: checking for max_fail_percentage
30583 1726853792.54855: done checking for max_fail_percentage
30583 1726853792.54856: checking to see if all hosts have failed and the running result is not ok
30583 1726853792.54857: done checking to see if all hosts have failed
30583 1726853792.54858: getting the remaining hosts for this loop
30583 1726853792.54860: done getting the remaining hosts for this loop
30583 1726853792.54863: getting the next task for host managed_node2
30583 1726853792.54873: done getting next task for host managed_node2
30583 1726853792.54876: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
30583 1726853792.54881: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853792.54904: getting variables
30583 1726853792.54906: in VariableManager get_vars()
30583 1726853792.54942: Calling all_inventory to load vars for managed_node2
30583 1726853792.54944: Calling groups_inventory to load vars for managed_node2
30583 1726853792.54946: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853792.54954: Calling all_plugins_play to load vars for managed_node2
30583 1726853792.54956: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853792.54959: Calling groups_plugins_play to load vars for managed_node2
30583 1726853792.55844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853792.56687: done with get_vars()
30583 1726853792.56702: done getting variables
30583 1726853792.56742: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024 13:36:32 -0400 (0:00:00.031) 0:02:07.904 ******
30583 1726853792.56767: entering _queue_task() for managed_node2/service
30583 1726853792.56966: worker is 1 (out of 1 available)
30583 1726853792.56980: exiting _queue_task() for managed_node2/service
30583 1726853792.56993: done queuing things up, now waiting for results queue to drain
30583 1726853792.56994: waiting for pending results...
30583 1726853792.57215: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
30583 1726853792.57315: in run() - task 02083763-bbaf-05ea-abc5-00000000269f
30583 1726853792.57327: variable 'ansible_search_path' from source: unknown
30583 1726853792.57331: variable 'ansible_search_path' from source: unknown
30583 1726853792.57356: calling self._execute()
30583 1726853792.57435: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853792.57441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853792.57448: variable 'omit' from source: magic vars
30583 1726853792.57725: variable 'ansible_distribution_major_version' from source: facts
30583 1726853792.57734: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853792.57824: variable '__network_wireless_connections_defined' from source: role '' defaults
30583 1726853792.57954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30583 1726853792.59460: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30583 1726853792.59517: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30583 1726853792.59545: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30583 1726853792.59575: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30583 1726853792.59596: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30583 1726853792.59652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853792.59676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853792.59693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853792.59719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853792.59731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853792.59765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853792.59783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853792.59799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853792.59825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853792.59834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853792.59867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853792.59885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853792.59901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853792.59925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853792.59936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853792.60051: variable 'network_connections' from source: include params
30583 1726853792.60062: variable 'interface' from source: play vars
30583 1726853792.60111: variable 'interface' from source: play vars
30583 1726853792.60159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30583 1726853792.60274: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30583 1726853792.60310: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30583 1726853792.60332: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30583 1726853792.60353: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30583 1726853792.60387: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30583 1726853792.60403: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30583 1726853792.60420: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853792.60437: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30583 1726853792.60478: variable '__network_team_connections_defined' from source: role '' defaults
30583 1726853792.60631: variable 'network_connections' from source: include params
30583 1726853792.60635: variable 'interface' from source: play vars
30583 1726853792.60681: variable 'interface' from source: play vars
30583 1726853792.60702: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
30583 1726853792.60705: when evaluation is False, skipping this task
30583 1726853792.60708: _execute() done
30583 1726853792.60710: dumping result to json
30583 1726853792.60712: done dumping result, returning
30583 1726853792.60719: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-05ea-abc5-00000000269f]
30583 1726853792.60727: sending task result for task 02083763-bbaf-05ea-abc5-00000000269f
30583 1726853792.60806: done sending task result for task 02083763-bbaf-05ea-abc5-00000000269f
30583 1726853792.60816: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
30583 1726853792.60862: no more pending results, returning what we have
30583 1726853792.60865: results queue empty
30583 1726853792.60866: checking for any_errors_fatal
30583 1726853792.60874: done checking for any_errors_fatal
30583 1726853792.60875: checking for max_fail_percentage
30583 1726853792.60877: done checking for max_fail_percentage
30583 1726853792.60877: checking to see if all hosts have failed and the running result is not ok
30583 1726853792.60878: done checking to see if all hosts have failed
30583 1726853792.60879: getting the remaining hosts for this loop
30583 1726853792.60881: done getting the remaining hosts for this loop
30583 1726853792.60884: getting the next task for host managed_node2
30583 1726853792.60892: done getting next task for host managed_node2
30583 1726853792.60897: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
30583 1726853792.60902: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853792.60930: getting variables
30583 1726853792.60932: in VariableManager get_vars()
30583 1726853792.60979: Calling all_inventory to load vars for managed_node2
30583 1726853792.60982: Calling groups_inventory to load vars for managed_node2
30583 1726853792.60984: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853792.60992: Calling all_plugins_play to load vars for managed_node2
30583 1726853792.60995: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853792.60997: Calling groups_plugins_play to load vars for managed_node2
30583 1726853792.61788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853792.62656: done with get_vars()
30583 1726853792.62674: done getting variables
30583 1726853792.62714: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024 13:36:32 -0400 (0:00:00.059) 0:02:07.964 ******
30583 1726853792.62738: entering _queue_task() for managed_node2/service
30583 1726853792.62964: worker is 1 (out of 1 available)
30583 1726853792.62979: exiting _queue_task() for managed_node2/service
30583 1726853792.62992: done queuing things up, now waiting for results queue to drain
30583 1726853792.62993: waiting for pending results...
30583 1726853792.63185: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
30583 1726853792.63290: in run() - task 02083763-bbaf-05ea-abc5-0000000026a0
30583 1726853792.63302: variable 'ansible_search_path' from source: unknown
30583 1726853792.63306: variable 'ansible_search_path' from source: unknown
30583 1726853792.63334: calling self._execute()
30583 1726853792.63410: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853792.63413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853792.63422: variable 'omit' from source: magic vars
30583 1726853792.63708: variable 'ansible_distribution_major_version' from source: facts
30583 1726853792.63717: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853792.63833: variable 'network_provider' from source: set_fact
30583 1726853792.63838: variable 'network_state' from source: role '' defaults
30583 1726853792.63848: Evaluated conditional (network_provider == "nm" or network_state != {}): True
30583 1726853792.63854: variable 'omit' from source: magic vars
30583 1726853792.63898: variable 'omit' from source: magic vars
30583 1726853792.63918: variable 'network_service_name' from source: role '' defaults
30583 1726853792.63964: variable 'network_service_name' from source: role '' defaults
30583 1726853792.64037: variable '__network_provider_setup' from source: role '' defaults
30583 1726853792.64041: variable '__network_service_name_default_nm' from source: role '' defaults
30583 1726853792.64090: variable '__network_service_name_default_nm' from source: role '' defaults
30583 1726853792.64098: variable '__network_packages_default_nm' from source: role '' defaults
30583 1726853792.64145: variable '__network_packages_default_nm' from source: role ''
defaults
30583 1726853792.64295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30583 1726853792.65984: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30583 1726853792.66032: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30583 1726853792.66061: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30583 1726853792.66089: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30583 1726853792.66109: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30583 1726853792.66170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853792.66193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853792.66210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853792.66236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853792.66247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853792.66285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853792.66301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853792.66317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853792.66341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853792.66351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853792.66500: variable '__network_packages_default_gobject_packages' from source: role '' defaults
30583 1726853792.66574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853792.66594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853792.66610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853792.66633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853792.66644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853792.66709: variable 'ansible_python' from source: facts
30583 1726853792.66721: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
30583 1726853792.66778: variable '__network_wpa_supplicant_required' from source: role '' defaults
30583 1726853792.66832: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
30583 1726853792.66918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853792.66934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853792.66950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853792.66978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853792.66989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853792.67022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30583 1726853792.67042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30583 1726853792.67058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853792.67087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30583 1726853792.67097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30583 1726853792.67193: variable 'network_connections' from source: include params
30583 1726853792.67200: variable 'interface' from source: play vars
30583 1726853792.67254: variable 'interface' from source: play vars
30583 1726853792.67323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30583 1726853792.67452: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30583 1726853792.67492: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30583 1726853792.67523: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30583 1726853792.67552: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30583 1726853792.67599: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30583 1726853792.67619: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30583 1726853792.67640: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30583 1726853792.67666: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30583 1726853792.67705: variable '__network_wireless_connections_defined' from source: role '' defaults
30583 1726853792.67883: variable 'network_connections' from source: include params
30583 1726853792.67887: variable 'interface' from source: play vars
30583 1726853792.67939: variable 'interface' from source: play vars
30583 1726853792.67962: variable '__network_packages_default_wireless' from source: role '' defaults
30583 1726853792.68019: variable '__network_wireless_connections_defined' from source: role '' defaults
30583 1726853792.68203: variable 'network_connections' from source: include params
30583 1726853792.68206: variable 'interface' from source: play vars
30583 1726853792.68257: variable 'interface' from source: play vars
30583 1726853792.68277: variable '__network_packages_default_team' from source: role '' defaults
30583 1726853792.68330: variable '__network_team_connections_defined' from source: role '' defaults
30583 1726853792.68513: variable 'network_connections' from source: include params
30583 1726853792.68517: variable 'interface' from source: play vars
30583 1726853792.68569: variable 'interface' from source: play vars
30583 1726853792.68605: variable '__network_service_name_default_initscripts' from source: role '' defaults
30583 1726853792.68648: variable '__network_service_name_default_initscripts' from source: role '' defaults
30583 1726853792.68651: variable '__network_packages_default_initscripts' from source: role '' defaults
30583 1726853792.68699: variable '__network_packages_default_initscripts' from source: role '' defaults
30583 1726853792.68831: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
30583 1726853792.69140: variable 'network_connections' from source: include params
30583 1726853792.69144: variable 'interface' from source: play vars
30583 1726853792.69190: variable 'interface' from source: play vars
30583 1726853792.69198: variable 'ansible_distribution' from source: facts
30583 1726853792.69201: variable '__network_rh_distros' from source: role '' defaults
30583 1726853792.69208: variable 'ansible_distribution_major_version' from source: facts
30583 1726853792.69217: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
30583 1726853792.69327: variable 'ansible_distribution' from source: facts
30583 1726853792.69331: variable '__network_rh_distros' from source: role '' defaults
30583 1726853792.69335: variable 'ansible_distribution_major_version' from source: facts
30583 1726853792.69346: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
30583 1726853792.69465: variable 'ansible_distribution' from source: facts
30583 1726853792.69468: variable '__network_rh_distros' from source: role '' defaults
30583 1726853792.69474: variable 'ansible_distribution_major_version' from source: facts
30583 1726853792.69499: variable 'network_provider' from source: set_fact
30583 1726853792.69516: variable 'omit' from source: magic vars
30583 1726853792.69539: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30583 1726853792.69562: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30583 1726853792.69578: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30583 1726853792.69591: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30583 1726853792.69599: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30583 1726853792.69622: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30583 1726853792.69624: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853792.69627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853792.69696: Set connection var ansible_module_compression to ZIP_DEFLATED
30583 1726853792.69702: Set connection var ansible_timeout to 10
30583 1726853792.69704: Set connection var ansible_connection to ssh
30583 1726853792.69709: Set connection var ansible_shell_executable to /bin/sh
30583 1726853792.69711: Set connection var ansible_shell_type to sh
30583 1726853792.69719: Set connection var ansible_pipelining to False
30583 1726853792.69738: variable 'ansible_shell_executable' from source: unknown
30583 1726853792.69740: variable 'ansible_connection' from source: unknown
30583 1726853792.69743: variable 'ansible_module_compression' from source: unknown
30583 1726853792.69747: variable 'ansible_shell_type' from source: unknown
30583 1726853792.69749: variable 'ansible_shell_executable' from source: unknown
30583 1726853792.69751: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853792.69753: variable 'ansible_pipelining' from source: unknown
30583 1726853792.69755: variable 'ansible_timeout' from source: unknown
30583 1726853792.69766: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853792.69836: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30583 1726853792.69844: variable 'omit' from source: magic vars
30583 1726853792.69852: starting attempt loop
30583 1726853792.69855: running the handler
30583 1726853792.69909: variable 'ansible_facts' from source: unknown
30583 1726853792.70316: _low_level_execute_command(): starting
30583 1726853792.70322: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
30583 1726853792.70819: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30583 1726853792.70823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853792.70826: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
30583 1726853792.70828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<<
30583 1726853792.70831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853792.70876: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<<
30583 1726853792.70879: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30583 1726853792.70891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30583 1726853792.70977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30583 1726853792.72706: stdout chunk (state=3): >>>/root <<<
30583 1726853792.72803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30583 1726853792.72829: stderr chunk (state=3): >>><<<
30583 1726853792.72832: stdout chunk (state=3): >>><<<
30583 1726853792.72848: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30583 1726853792.72857: _low_level_execute_command(): starting
30583 1726853792.72864: _low_level_execute_command(): executing: /bin/sh
-c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853792.728474-36021-105607575974282 `" && echo ansible-tmp-1726853792.728474-36021-105607575974282="` echo /root/.ansible/tmp/ansible-tmp-1726853792.728474-36021-105607575974282 `" ) && sleep 0' 30583 1726853792.73280: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853792.73283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853792.73286: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853792.73288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853792.73340: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853792.73346: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853792.73349: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853792.73416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853792.75424: stdout chunk (state=3): 
>>>ansible-tmp-1726853792.728474-36021-105607575974282=/root/.ansible/tmp/ansible-tmp-1726853792.728474-36021-105607575974282 <<< 30583 1726853792.75530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853792.75560: stderr chunk (state=3): >>><<< 30583 1726853792.75563: stdout chunk (state=3): >>><<< 30583 1726853792.75576: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853792.728474-36021-105607575974282=/root/.ansible/tmp/ansible-tmp-1726853792.728474-36021-105607575974282 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853792.75601: variable 'ansible_module_compression' from source: unknown 30583 1726853792.75641: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30583 1726853792.75691: variable 'ansible_facts' 
from source: unknown 30583 1726853792.75825: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853792.728474-36021-105607575974282/AnsiballZ_systemd.py 30583 1726853792.75927: Sending initial data 30583 1726853792.75930: Sent initial data (155 bytes) 30583 1726853792.76373: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853792.76376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853792.76382: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853792.76385: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853792.76387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853792.76389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853792.76433: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853792.76443: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853792.76445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853792.76506: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853792.78174: stderr chunk (state=3): >>>debug2: 
Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853792.78239: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853792.78308: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpis19zqsq /root/.ansible/tmp/ansible-tmp-1726853792.728474-36021-105607575974282/AnsiballZ_systemd.py <<< 30583 1726853792.78314: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853792.728474-36021-105607575974282/AnsiballZ_systemd.py" <<< 30583 1726853792.78382: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpis19zqsq" to remote "/root/.ansible/tmp/ansible-tmp-1726853792.728474-36021-105607575974282/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853792.728474-36021-105607575974282/AnsiballZ_systemd.py" <<< 30583 1726853792.79577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853792.79613: stderr chunk (state=3): >>><<< 30583 1726853792.79617: stdout chunk (state=3): >>><<< 30583 1726853792.79641: done transferring module to remote 30583 1726853792.79649: _low_level_execute_command(): starting 
30583 1726853792.79653: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853792.728474-36021-105607575974282/ /root/.ansible/tmp/ansible-tmp-1726853792.728474-36021-105607575974282/AnsiballZ_systemd.py && sleep 0' 30583 1726853792.80080: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853792.80083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853792.80085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853792.80087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853792.80089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853792.80135: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853792.80138: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853792.80212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853792.82119: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853792.82132: stderr chunk (state=3): >>><<< 30583 
1726853792.82151: stdout chunk (state=3): >>><<< 30583 1726853792.82244: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853792.82247: _low_level_execute_command(): starting 30583 1726853792.82250: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853792.728474-36021-105607575974282/AnsiballZ_systemd.py && sleep 0' 30583 1726853792.82606: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853792.82618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853792.82632: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853792.82679: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853792.82699: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853792.82777: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853793.12585: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": 
"[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4616192", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3301556224", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2070541000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", 
"CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredum<<< 30583 1726853793.12606: stdout chunk (state=3): >>>pReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", 
"LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "syst<<< 30583 1726853793.12616: stdout chunk (state=3): >>>em.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", 
"InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30583 1726853793.14559: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853793.14591: stderr chunk (state=3): >>><<< 30583 1726853793.14594: stdout chunk (state=3): >>><<< 30583 1726853793.14611: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6954", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ExecMainStartTimestampMonotonic": "354241069", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ExecMainHandoffTimestampMonotonic": "354259688", "ExecMainPID": "6954", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4616192", "MemoryPeak": "8294400", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3301556224", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2070541000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "system.slice dbus-broker.service basic.target sysinit.target cloud-init-local.service systemd-journald.socket dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:36 EDT", "StateChangeTimestampMonotonic": "466727849", "InactiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveExitTimestampMonotonic": "354242944", "ActiveEnterTimestamp": "Fri 2024-09-20 13:25:44 EDT", "ActiveEnterTimestampMonotonic": "354340344", "ActiveExitTimestamp": "Fri 2024-09-20 13:25:43 EDT", "ActiveExitTimestampMonotonic": "354211543", "InactiveEnterTimestamp": "Fri 2024-09-20 13:25:43 EDT", "InactiveEnterTimestampMonotonic": "354237904", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:25:43 EDT", 
"ConditionTimestampMonotonic": "354239203", "AssertTimestamp": "Fri 2024-09-20 13:25:43 EDT", "AssertTimestampMonotonic": "354239218", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6c038df3c47d4ceeb77d538416d0146a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
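The module result above ends with the invocation block, which records the exact module_args passed to ansible.legacy.systemd. Reconstructed from those logged arguments, the role task that produced this result is roughly the following sketch (the real task lives in the fedora.linux_system_roles.network role at tasks/main.yml and may differ in name and variable indirection):

```yaml
# Hedged reconstruction from the logged module_args only; task name and
# no_log placement are inferred from the surrounding log, not verified
# against the role source.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true  # matches the "censored ... 'no_log: true'" result in the log
```

The daemon_reload, daemon_reexec, scope, no_block, force, and masked arguments visible in the invocation are all at their defaults, so they are omitted here.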
30583 1726853793.14738: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853792.728474-36021-105607575974282/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853793.14753: _low_level_execute_command(): starting 30583 1726853793.14758: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853792.728474-36021-105607575974282/ > /dev/null 2>&1 && sleep 0' 30583 1726853793.15207: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853793.15210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853793.15212: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853793.15214: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853793.15216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853793.15269: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853793.15277: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853793.15280: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853793.15341: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853793.17223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853793.17248: stderr chunk (state=3): >>><<< 30583 1726853793.17251: stdout chunk (state=3): >>><<< 30583 1726853793.17269: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853793.17275: handler run complete 30583 1726853793.17315: attempt loop complete, returning result 30583 1726853793.17318: _execute() done 30583 1726853793.17320: dumping result to json 30583 1726853793.17332: done dumping result, returning 30583 1726853793.17341: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-05ea-abc5-0000000026a0] 30583 1726853793.17345: sending task result for task 02083763-bbaf-05ea-abc5-0000000026a0 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853793.17875: no more pending results, returning what we have 30583 1726853793.17878: results queue empty 30583 1726853793.17879: checking for any_errors_fatal 30583 1726853793.17881: done checking for any_errors_fatal 30583 1726853793.17882: checking for max_fail_percentage 30583 1726853793.17883: done checking for max_fail_percentage 30583 1726853793.17883: checking to see if all hosts have failed and the running result is not ok 30583 1726853793.17884: done checking to see if all hosts have failed 30583 1726853793.17884: getting the remaining hosts for this loop 30583 1726853793.17885: done getting the remaining hosts for this loop 30583 1726853793.17888: getting the next task for host managed_node2 30583 1726853793.17894: done getting next task for host managed_node2 30583 1726853793.17897: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30583 1726853793.17902: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853793.17911: done sending task result for task 02083763-bbaf-05ea-abc5-0000000026a0 30583 1726853793.17914: WORKER PROCESS EXITING 30583 1726853793.17920: getting variables 30583 1726853793.17921: in VariableManager get_vars() 30583 1726853793.17946: Calling all_inventory to load vars for managed_node2 30583 1726853793.17948: Calling groups_inventory to load vars for managed_node2 30583 1726853793.17950: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853793.17956: Calling all_plugins_play to load vars for managed_node2 30583 1726853793.17958: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853793.17960: Calling groups_plugins_play to load vars for managed_node2 30583 1726853793.18623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853793.19482: done with get_vars() 30583 1726853793.19498: done getting variables 30583 1726853793.19541: Loading ActionModule 'service' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:36:33 -0400 (0:00:00.568) 0:02:08.532 ****** 30583 1726853793.19572: entering _queue_task() for managed_node2/service 30583 1726853793.19810: worker is 1 (out of 1 available) 30583 1726853793.19823: exiting _queue_task() for managed_node2/service 30583 1726853793.19837: done queuing things up, now waiting for results queue to drain 30583 1726853793.19838: waiting for pending results... 30583 1726853793.20030: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30583 1726853793.20139: in run() - task 02083763-bbaf-05ea-abc5-0000000026a1 30583 1726853793.20149: variable 'ansible_search_path' from source: unknown 30583 1726853793.20152: variable 'ansible_search_path' from source: unknown 30583 1726853793.20189: calling self._execute() 30583 1726853793.20264: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853793.20268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853793.20279: variable 'omit' from source: magic vars 30583 1726853793.20561: variable 'ansible_distribution_major_version' from source: facts 30583 1726853793.20568: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853793.20651: variable 'network_provider' from source: set_fact 30583 1726853793.20656: Evaluated conditional (network_provider == "nm"): True 30583 1726853793.20722: variable '__network_wpa_supplicant_required' from source: role '' defaults 30583 
1726853793.20791: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30583 1726853793.20908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853793.22334: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853793.22383: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853793.22412: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853793.22437: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853793.22457: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853793.22637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853793.22660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853793.22678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853793.22706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853793.22718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30583 1726853793.22749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853793.22766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853793.22784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853793.22811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853793.22823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853793.22849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853793.22866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853793.22884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853793.22909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853793.22920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853793.23019: variable 'network_connections' from source: include params 30583 1726853793.23028: variable 'interface' from source: play vars 30583 1726853793.23076: variable 'interface' from source: play vars 30583 1726853793.23126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30583 1726853793.23234: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30583 1726853793.23261: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30583 1726853793.23283: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30583 1726853793.23304: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30583 1726853793.23335: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30583 1726853793.23352: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30583 1726853793.23372: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853793.23389: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30583 1726853793.23426: variable '__network_wireless_connections_defined' from source: role '' defaults 30583 1726853793.23579: variable 'network_connections' from source: include params 30583 1726853793.23583: variable 'interface' from source: play vars 30583 1726853793.23625: variable 'interface' from source: play vars 30583 1726853793.23646: Evaluated conditional (__network_wpa_supplicant_required): False 30583 1726853793.23650: when evaluation is False, skipping this task 30583 1726853793.23653: _execute() done 30583 1726853793.23655: dumping result to json 30583 1726853793.23661: done dumping result, returning 30583 1726853793.23665: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-05ea-abc5-0000000026a1] 30583 1726853793.23678: sending task result for task 02083763-bbaf-05ea-abc5-0000000026a1 30583 1726853793.23760: done sending task result for task 02083763-bbaf-05ea-abc5-0000000026a1 30583 1726853793.23762: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30583 1726853793.23830: no more pending results, returning what we have 30583 1726853793.23833: results queue empty 30583 1726853793.23835: checking for any_errors_fatal 30583 1726853793.23862: done checking for any_errors_fatal 30583 1726853793.23863: checking for max_fail_percentage 30583 1726853793.23866: done checking for max_fail_percentage 30583 1726853793.23866: checking to see if all hosts have failed and the running result is not ok 30583 1726853793.23867: done checking to see if all hosts have failed 30583 1726853793.23868: getting the remaining hosts for this loop 30583 1726853793.23870: done getting the remaining hosts for this loop 30583 1726853793.23875: getting the next task 
for host managed_node2 30583 1726853793.23883: done getting next task for host managed_node2 30583 1726853793.23887: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30583 1726853793.23892: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853793.23916: getting variables 30583 1726853793.23918: in VariableManager get_vars() 30583 1726853793.23961: Calling all_inventory to load vars for managed_node2 30583 1726853793.23964: Calling groups_inventory to load vars for managed_node2 30583 1726853793.23966: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853793.23979: Calling all_plugins_play to load vars for managed_node2 30583 1726853793.23982: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853793.23985: Calling groups_plugins_play to load vars for managed_node2 30583 1726853793.24861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853793.25741: done with get_vars() 30583 1726853793.25756: done getting variables 30583 1726853793.25801: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:36:33 -0400 (0:00:00.062) 0:02:08.595 ****** 30583 1726853793.25826: entering _queue_task() for managed_node2/service 30583 1726853793.26070: worker is 1 (out of 1 available) 30583 1726853793.26084: exiting _queue_task() for managed_node2/service 30583 1726853793.26097: done queuing things up, now waiting for results queue to drain 30583 1726853793.26098: waiting for pending results... 
30583 1726853793.26279: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30583 1726853793.26356: in run() - task 02083763-bbaf-05ea-abc5-0000000026a2 30583 1726853793.26369: variable 'ansible_search_path' from source: unknown 30583 1726853793.26374: variable 'ansible_search_path' from source: unknown 30583 1726853793.26401: calling self._execute() 30583 1726853793.26488: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853793.26492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853793.26501: variable 'omit' from source: magic vars 30583 1726853793.26783: variable 'ansible_distribution_major_version' from source: facts 30583 1726853793.26792: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853793.26873: variable 'network_provider' from source: set_fact 30583 1726853793.26877: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853793.26880: when evaluation is False, skipping this task 30583 1726853793.26882: _execute() done 30583 1726853793.26885: dumping result to json 30583 1726853793.26887: done dumping result, returning 30583 1726853793.26894: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-05ea-abc5-0000000026a2] 30583 1726853793.26896: sending task result for task 02083763-bbaf-05ea-abc5-0000000026a2 30583 1726853793.26984: done sending task result for task 02083763-bbaf-05ea-abc5-0000000026a2 30583 1726853793.26987: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30583 1726853793.27030: no more pending results, returning what we have 30583 1726853793.27034: results queue empty 30583 1726853793.27036: checking for any_errors_fatal 30583 1726853793.27046: done checking for 
any_errors_fatal 30583 1726853793.27047: checking for max_fail_percentage 30583 1726853793.27049: done checking for max_fail_percentage 30583 1726853793.27049: checking to see if all hosts have failed and the running result is not ok 30583 1726853793.27050: done checking to see if all hosts have failed 30583 1726853793.27051: getting the remaining hosts for this loop 30583 1726853793.27053: done getting the remaining hosts for this loop 30583 1726853793.27056: getting the next task for host managed_node2 30583 1726853793.27067: done getting next task for host managed_node2 30583 1726853793.27070: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853793.27077: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853793.27102: getting variables 30583 1726853793.27103: in VariableManager get_vars() 30583 1726853793.27139: Calling all_inventory to load vars for managed_node2 30583 1726853793.27142: Calling groups_inventory to load vars for managed_node2 30583 1726853793.27144: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853793.27152: Calling all_plugins_play to load vars for managed_node2 30583 1726853793.27154: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853793.27156: Calling groups_plugins_play to load vars for managed_node2 30583 1726853793.27911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853793.28791: done with get_vars() 30583 1726853793.28807: done getting variables 30583 1726853793.28848: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:36:33 -0400 (0:00:00.030) 0:02:08.625 ****** 30583 1726853793.28878: entering _queue_task() for managed_node2/copy 30583 1726853793.29119: worker is 1 (out of 1 available) 30583 1726853793.29134: exiting _queue_task() for managed_node2/copy 30583 1726853793.29147: done queuing things up, now waiting for results queue to drain 30583 1726853793.29148: waiting for pending results... 
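The two skips around this point ("Enable network service" at main.yml:142 and the initscripts file dependency at main.yml:150) both follow the same pattern visible in the log: the task's when: clause is evaluated, the conditional comes back False, and the executor short-circuits with "when evaluation is False, skipping this task". A hedged sketch of that gating pattern, using the condition string the log itself reports as the false_condition:

```yaml
# Sketch of the skip pattern only; the real task body in the role is not
# shown in this log (the service result is censored by no_log).
- name: Enable network service
  ansible.builtin.service:
    name: network
    enabled: true
  when: network_provider == "initscripts"  # evaluates False here (provider is "nm")
```

Because network_provider was set to "nm" earlier in the run, every initscripts-only task in this block is skipped without contacting the managed host.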
30583 1726853793.29337: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30583 1726853793.29439: in run() - task 02083763-bbaf-05ea-abc5-0000000026a3 30583 1726853793.29450: variable 'ansible_search_path' from source: unknown 30583 1726853793.29454: variable 'ansible_search_path' from source: unknown 30583 1726853793.29488: calling self._execute() 30583 1726853793.29560: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853793.29565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853793.29574: variable 'omit' from source: magic vars 30583 1726853793.29848: variable 'ansible_distribution_major_version' from source: facts 30583 1726853793.29859: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853793.29938: variable 'network_provider' from source: set_fact 30583 1726853793.29943: Evaluated conditional (network_provider == "initscripts"): False 30583 1726853793.29946: when evaluation is False, skipping this task 30583 1726853793.29948: _execute() done 30583 1726853793.29951: dumping result to json 30583 1726853793.29954: done dumping result, returning 30583 1726853793.29978: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-05ea-abc5-0000000026a3] 30583 1726853793.29981: sending task result for task 02083763-bbaf-05ea-abc5-0000000026a3 30583 1726853793.30055: done sending task result for task 02083763-bbaf-05ea-abc5-0000000026a3 30583 1726853793.30060: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30583 1726853793.30110: no more pending results, returning what we have 30583 1726853793.30113: results queue empty 30583 1726853793.30115: checking for 
any_errors_fatal 30583 1726853793.30121: done checking for any_errors_fatal 30583 1726853793.30122: checking for max_fail_percentage 30583 1726853793.30124: done checking for max_fail_percentage 30583 1726853793.30125: checking to see if all hosts have failed and the running result is not ok 30583 1726853793.30126: done checking to see if all hosts have failed 30583 1726853793.30126: getting the remaining hosts for this loop 30583 1726853793.30128: done getting the remaining hosts for this loop 30583 1726853793.30132: getting the next task for host managed_node2 30583 1726853793.30139: done getting next task for host managed_node2 30583 1726853793.30142: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853793.30147: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853793.30176: getting variables 30583 1726853793.30177: in VariableManager get_vars() 30583 1726853793.30214: Calling all_inventory to load vars for managed_node2 30583 1726853793.30216: Calling groups_inventory to load vars for managed_node2 30583 1726853793.30218: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853793.30225: Calling all_plugins_play to load vars for managed_node2 30583 1726853793.30228: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853793.30230: Calling groups_plugins_play to load vars for managed_node2 30583 1726853793.31129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853793.31991: done with get_vars() 30583 1726853793.32008: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:36:33 -0400 (0:00:00.031) 0:02:08.657 ****** 30583 1726853793.32069: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853793.32304: worker is 1 (out of 1 available) 30583 1726853793.32318: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30583 1726853793.32331: done queuing things up, now waiting for results queue to drain 30583 1726853793.32332: waiting for pending results... 
30583 1726853793.32514: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30583 1726853793.32609: in run() - task 02083763-bbaf-05ea-abc5-0000000026a4 30583 1726853793.32622: variable 'ansible_search_path' from source: unknown 30583 1726853793.32625: variable 'ansible_search_path' from source: unknown 30583 1726853793.32651: calling self._execute() 30583 1726853793.32729: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853793.32734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853793.32743: variable 'omit' from source: magic vars 30583 1726853793.33020: variable 'ansible_distribution_major_version' from source: facts 30583 1726853793.33029: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853793.33036: variable 'omit' from source: magic vars 30583 1726853793.33079: variable 'omit' from source: magic vars 30583 1726853793.33191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853793.34617: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853793.34664: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853793.34693: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853793.34722: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853793.34743: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853793.34805: variable 'network_provider' from source: set_fact 30583 1726853793.34896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853793.34915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853793.34932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853793.34963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30583 1726853793.34973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853793.35025: variable 'omit' from source: magic vars 30583 1726853793.35101: variable 'omit' from source: magic vars 30583 1726853793.35169: variable 'network_connections' from source: include params 30583 1726853793.35276: variable 'interface' from source: play vars 30583 1726853793.35279: variable 'interface' from source: play vars 30583 1726853793.35327: variable 'omit' from source: magic vars 30583 1726853793.35334: variable '__lsr_ansible_managed' from source: task vars 30583 1726853793.35377: variable '__lsr_ansible_managed' from source: task vars 30583 1726853793.35510: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30583 1726853793.35642: Loaded config def from plugin (lookup/template) 30583 1726853793.35645: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30583 1726853793.35668: File lookup term: get_ansible_managed.j2 30583 1726853793.35672: variable 
'ansible_search_path' from source: unknown 30583 1726853793.35676: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30583 1726853793.35686: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30583 1726853793.35699: variable 'ansible_search_path' from source: unknown 30583 1726853793.39085: variable 'ansible_managed' from source: unknown 30583 1726853793.39162: variable 'omit' from source: magic vars 30583 1726853793.39184: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853793.39204: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853793.39218: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853793.39230: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30583 1726853793.39238: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853793.39261: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853793.39264: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853793.39266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853793.39329: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853793.39334: Set connection var ansible_timeout to 10 30583 1726853793.39337: Set connection var ansible_connection to ssh 30583 1726853793.39342: Set connection var ansible_shell_executable to /bin/sh 30583 1726853793.39344: Set connection var ansible_shell_type to sh 30583 1726853793.39351: Set connection var ansible_pipelining to False 30583 1726853793.39370: variable 'ansible_shell_executable' from source: unknown 30583 1726853793.39375: variable 'ansible_connection' from source: unknown 30583 1726853793.39377: variable 'ansible_module_compression' from source: unknown 30583 1726853793.39379: variable 'ansible_shell_type' from source: unknown 30583 1726853793.39382: variable 'ansible_shell_executable' from source: unknown 30583 1726853793.39385: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853793.39388: variable 'ansible_pipelining' from source: unknown 30583 1726853793.39391: variable 'ansible_timeout' from source: unknown 30583 1726853793.39394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853793.39481: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853793.39493: variable 'omit' from 
source: magic vars 30583 1726853793.39496: starting attempt loop 30583 1726853793.39498: running the handler 30583 1726853793.39511: _low_level_execute_command(): starting 30583 1726853793.39517: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853793.40006: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853793.40010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853793.40012: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853793.40014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853793.40065: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853793.40068: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853793.40074: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853793.40158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853793.41911: stdout chunk (state=3): >>>/root <<< 30583 1726853793.42007: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 
1726853793.42039: stderr chunk (state=3): >>><<< 30583 1726853793.42042: stdout chunk (state=3): >>><<< 30583 1726853793.42060: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853793.42073: _low_level_execute_command(): starting 30583 1726853793.42079: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853793.4206202-36035-249689808417234 `" && echo ansible-tmp-1726853793.4206202-36035-249689808417234="` echo /root/.ansible/tmp/ansible-tmp-1726853793.4206202-36035-249689808417234 `" ) && sleep 0' 30583 1726853793.42510: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853793.42514: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853793.42516: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853793.42518: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853793.42520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853793.42522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853793.42577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853793.42581: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853793.42648: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853793.44663: stdout chunk (state=3): >>>ansible-tmp-1726853793.4206202-36035-249689808417234=/root/.ansible/tmp/ansible-tmp-1726853793.4206202-36035-249689808417234 <<< 30583 1726853793.44769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853793.44798: stderr chunk (state=3): >>><<< 30583 1726853793.44801: stdout chunk (state=3): >>><<< 30583 1726853793.44817: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853793.4206202-36035-249689808417234=/root/.ansible/tmp/ansible-tmp-1726853793.4206202-36035-249689808417234 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853793.44855: variable 'ansible_module_compression' from source: unknown 30583 1726853793.44889: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30583 1726853793.44930: variable 'ansible_facts' from source: unknown 30583 1726853793.45016: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853793.4206202-36035-249689808417234/AnsiballZ_network_connections.py 30583 1726853793.45117: Sending initial data 30583 1726853793.45121: Sent initial data (168 bytes) 30583 1726853793.45555: stderr chunk (state=3): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853793.45558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853793.45564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853793.45566: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853793.45568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853793.45617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853793.45620: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853793.45698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853793.47369: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server 
supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30583 1726853793.47374: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853793.47440: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853793.47511: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp90lrxe23 /root/.ansible/tmp/ansible-tmp-1726853793.4206202-36035-249689808417234/AnsiballZ_network_connections.py <<< 30583 1726853793.47518: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853793.4206202-36035-249689808417234/AnsiballZ_network_connections.py" <<< 30583 1726853793.47581: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp90lrxe23" to remote "/root/.ansible/tmp/ansible-tmp-1726853793.4206202-36035-249689808417234/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853793.4206202-36035-249689808417234/AnsiballZ_network_connections.py" <<< 30583 1726853793.48445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853793.48486: stderr chunk (state=3): >>><<< 30583 1726853793.48489: stdout chunk (state=3): >>><<< 30583 1726853793.48526: done transferring module to remote 30583 1726853793.48535: _low_level_execute_command(): starting 30583 1726853793.48540: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853793.4206202-36035-249689808417234/ /root/.ansible/tmp/ansible-tmp-1726853793.4206202-36035-249689808417234/AnsiballZ_network_connections.py && sleep 0' 30583 1726853793.48953: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853793.48957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853793.48973: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853793.49024: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853793.49031: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853793.49096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853793.50990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853793.51013: stderr chunk (state=3): >>><<< 30583 1726853793.51016: stdout chunk (state=3): >>><<< 30583 1726853793.51028: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853793.51031: _low_level_execute_command(): starting 30583 1726853793.51036: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853793.4206202-36035-249689808417234/AnsiballZ_network_connections.py && sleep 0' 30583 1726853793.51457: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853793.51460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853793.51462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853793.51464: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30583 1726853793.51466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853793.51517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853793.51528: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853793.51600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853793.78773: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30583 1726853793.80715: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853793.80742: stderr chunk (state=3): >>><<< 30583 1726853793.80746: stdout chunk (state=3): >>><<< 30583 1726853793.80761: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853793.80796: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853793.4206202-36035-249689808417234/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853793.80804: _low_level_execute_command(): starting 30583 1726853793.80809: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853793.4206202-36035-249689808417234/ > /dev/null 2>&1 && sleep 0' 30583 1726853793.81260: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853793.81264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853793.81266: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is 
address debug1: re-parsing configuration <<< 30583 1726853793.81269: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853793.81272: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853793.81319: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853793.81323: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853793.81398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853793.83362: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853793.83389: stderr chunk (state=3): >>><<< 30583 1726853793.83393: stdout chunk (state=3): >>><<< 30583 1726853793.83406: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853793.83412: handler run complete 30583 1726853793.83431: attempt loop complete, returning result 30583 1726853793.83434: _execute() done 30583 1726853793.83437: dumping result to json 30583 1726853793.83442: done dumping result, returning 30583 1726853793.83451: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-05ea-abc5-0000000026a4] 30583 1726853793.83455: sending task result for task 02083763-bbaf-05ea-abc5-0000000026a4 30583 1726853793.83563: done sending task result for task 02083763-bbaf-05ea-abc5-0000000026a4 30583 1726853793.83574: WORKER PROCESS EXITING ok: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete 30583 1726853793.83682: no more pending results, returning what we have 30583 1726853793.83686: results queue empty 30583 1726853793.83687: checking for any_errors_fatal 30583 1726853793.83695: done checking for any_errors_fatal 30583 1726853793.83695: checking for max_fail_percentage 30583 1726853793.83697: done checking for max_fail_percentage 30583 1726853793.83698: checking to see if all hosts have failed and the running result is not ok 30583 1726853793.83699: done checking to see if all hosts have failed 30583 
1726853793.83699: getting the remaining hosts for this loop 30583 1726853793.83701: done getting the remaining hosts for this loop 30583 1726853793.83704: getting the next task for host managed_node2 30583 1726853793.83712: done getting next task for host managed_node2 30583 1726853793.83715: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853793.83720: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853793.83732: getting variables 30583 1726853793.83734: in VariableManager get_vars() 30583 1726853793.83779: Calling all_inventory to load vars for managed_node2 30583 1726853793.83782: Calling groups_inventory to load vars for managed_node2 30583 1726853793.83784: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853793.83793: Calling all_plugins_play to load vars for managed_node2 30583 1726853793.83795: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853793.83798: Calling groups_plugins_play to load vars for managed_node2 30583 1726853793.84641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853793.85643: done with get_vars() 30583 1726853793.85660: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:36:33 -0400 (0:00:00.536) 0:02:09.194 ****** 30583 1726853793.85723: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30583 1726853793.85990: worker is 1 (out of 1 available) 30583 1726853793.86005: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30583 1726853793.86018: done queuing things up, now waiting for results queue to drain 30583 1726853793.86019: waiting for pending results... 
30583 1726853793.86219: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 30583 1726853793.86316: in run() - task 02083763-bbaf-05ea-abc5-0000000026a5 30583 1726853793.86328: variable 'ansible_search_path' from source: unknown 30583 1726853793.86332: variable 'ansible_search_path' from source: unknown 30583 1726853793.86365: calling self._execute() 30583 1726853793.86448: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853793.86453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853793.86466: variable 'omit' from source: magic vars 30583 1726853793.86759: variable 'ansible_distribution_major_version' from source: facts 30583 1726853793.86772: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853793.86858: variable 'network_state' from source: role '' defaults 30583 1726853793.86873: Evaluated conditional (network_state != {}): False 30583 1726853793.86876: when evaluation is False, skipping this task 30583 1726853793.86878: _execute() done 30583 1726853793.86881: dumping result to json 30583 1726853793.86883: done dumping result, returning 30583 1726853793.86892: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-05ea-abc5-0000000026a5] 30583 1726853793.86894: sending task result for task 02083763-bbaf-05ea-abc5-0000000026a5 30583 1726853793.86981: done sending task result for task 02083763-bbaf-05ea-abc5-0000000026a5 30583 1726853793.86984: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30583 1726853793.87058: no more pending results, returning what we have 30583 1726853793.87063: results queue empty 30583 1726853793.87064: checking for any_errors_fatal 30583 1726853793.87082: done checking for any_errors_fatal 
30583 1726853793.87083: checking for max_fail_percentage 30583 1726853793.87084: done checking for max_fail_percentage 30583 1726853793.87085: checking to see if all hosts have failed and the running result is not ok 30583 1726853793.87086: done checking to see if all hosts have failed 30583 1726853793.87086: getting the remaining hosts for this loop 30583 1726853793.87088: done getting the remaining hosts for this loop 30583 1726853793.87092: getting the next task for host managed_node2 30583 1726853793.87101: done getting next task for host managed_node2 30583 1726853793.87104: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30583 1726853793.87109: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853793.87135: getting variables 30583 1726853793.87136: in VariableManager get_vars() 30583 1726853793.87182: Calling all_inventory to load vars for managed_node2 30583 1726853793.87185: Calling groups_inventory to load vars for managed_node2 30583 1726853793.87187: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853793.87195: Calling all_plugins_play to load vars for managed_node2 30583 1726853793.87197: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853793.87199: Calling groups_plugins_play to load vars for managed_node2 30583 1726853793.88001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853793.88868: done with get_vars() 30583 1726853793.88887: done getting variables 30583 1726853793.88929: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:36:33 -0400 (0:00:00.032) 0:02:09.226 ****** 30583 1726853793.88954: entering _queue_task() for managed_node2/debug 30583 1726853793.89205: worker is 1 (out of 1 available) 30583 1726853793.89218: exiting _queue_task() for managed_node2/debug 30583 1726853793.89230: done queuing things up, now waiting for results queue to drain 30583 1726853793.89232: waiting for pending results... 
30583 1726853793.89431: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30583 1726853793.89541: in run() - task 02083763-bbaf-05ea-abc5-0000000026a6 30583 1726853793.89553: variable 'ansible_search_path' from source: unknown 30583 1726853793.89557: variable 'ansible_search_path' from source: unknown 30583 1726853793.89592: calling self._execute() 30583 1726853793.89674: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853793.89682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853793.89686: variable 'omit' from source: magic vars 30583 1726853793.89984: variable 'ansible_distribution_major_version' from source: facts 30583 1726853793.89994: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853793.90000: variable 'omit' from source: magic vars 30583 1726853793.90042: variable 'omit' from source: magic vars 30583 1726853793.90070: variable 'omit' from source: magic vars 30583 1726853793.90107: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853793.90137: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853793.90152: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853793.90168: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853793.90179: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853793.90203: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853793.90206: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853793.90209: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 30583 1726853793.90283: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853793.90287: Set connection var ansible_timeout to 10 30583 1726853793.90290: Set connection var ansible_connection to ssh 30583 1726853793.90296: Set connection var ansible_shell_executable to /bin/sh 30583 1726853793.90298: Set connection var ansible_shell_type to sh 30583 1726853793.90306: Set connection var ansible_pipelining to False 30583 1726853793.90324: variable 'ansible_shell_executable' from source: unknown 30583 1726853793.90326: variable 'ansible_connection' from source: unknown 30583 1726853793.90329: variable 'ansible_module_compression' from source: unknown 30583 1726853793.90331: variable 'ansible_shell_type' from source: unknown 30583 1726853793.90333: variable 'ansible_shell_executable' from source: unknown 30583 1726853793.90335: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853793.90341: variable 'ansible_pipelining' from source: unknown 30583 1726853793.90343: variable 'ansible_timeout' from source: unknown 30583 1726853793.90345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853793.90448: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853793.90462: variable 'omit' from source: magic vars 30583 1726853793.90465: starting attempt loop 30583 1726853793.90467: running the handler 30583 1726853793.90561: variable '__network_connections_result' from source: set_fact 30583 1726853793.90607: handler run complete 30583 1726853793.90620: attempt loop complete, returning result 30583 1726853793.90623: _execute() done 30583 1726853793.90625: dumping result to json 30583 1726853793.90627: 
done dumping result, returning 30583 1726853793.90637: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-05ea-abc5-0000000026a6] 30583 1726853793.90640: sending task result for task 02083763-bbaf-05ea-abc5-0000000026a6 30583 1726853793.90725: done sending task result for task 02083763-bbaf-05ea-abc5-0000000026a6 30583 1726853793.90728: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } 30583 1726853793.90803: no more pending results, returning what we have 30583 1726853793.90807: results queue empty 30583 1726853793.90808: checking for any_errors_fatal 30583 1726853793.90815: done checking for any_errors_fatal 30583 1726853793.90815: checking for max_fail_percentage 30583 1726853793.90817: done checking for max_fail_percentage 30583 1726853793.90818: checking to see if all hosts have failed and the running result is not ok 30583 1726853793.90819: done checking to see if all hosts have failed 30583 1726853793.90819: getting the remaining hosts for this loop 30583 1726853793.90821: done getting the remaining hosts for this loop 30583 1726853793.90825: getting the next task for host managed_node2 30583 1726853793.90832: done getting next task for host managed_node2 30583 1726853793.90835: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30583 1726853793.90841: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853793.90853: getting variables 30583 1726853793.90854: in VariableManager get_vars() 30583 1726853793.90902: Calling all_inventory to load vars for managed_node2 30583 1726853793.90905: Calling groups_inventory to load vars for managed_node2 30583 1726853793.90907: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853793.90916: Calling all_plugins_play to load vars for managed_node2 30583 1726853793.90919: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853793.90921: Calling groups_plugins_play to load vars for managed_node2 30583 1726853793.91878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853793.92728: done with get_vars() 30583 1726853793.92744: done getting variables 30583 1726853793.92790: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:36:33 -0400 (0:00:00.038) 0:02:09.265 ****** 30583 1726853793.92820: entering _queue_task() for managed_node2/debug 30583 1726853793.93054: worker is 1 (out of 1 available) 30583 1726853793.93068: exiting _queue_task() for managed_node2/debug 30583 1726853793.93084: done queuing things up, now waiting for results queue to drain 30583 1726853793.93085: waiting for pending results... 30583 1726853793.93283: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30583 1726853793.93388: in run() - task 02083763-bbaf-05ea-abc5-0000000026a7 30583 1726853793.93399: variable 'ansible_search_path' from source: unknown 30583 1726853793.93403: variable 'ansible_search_path' from source: unknown 30583 1726853793.93439: calling self._execute() 30583 1726853793.93520: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853793.93526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853793.93535: variable 'omit' from source: magic vars 30583 1726853793.93822: variable 'ansible_distribution_major_version' from source: facts 30583 1726853793.93841: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853793.93847: variable 'omit' from source: magic vars 30583 1726853793.93893: variable 'omit' from source: magic vars 30583 1726853793.93918: variable 'omit' from source: magic vars 30583 1726853793.93951: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853793.93982: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853793.93998: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853793.94011: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853793.94021: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853793.94044: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853793.94048: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853793.94050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853793.94122: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853793.94126: Set connection var ansible_timeout to 10 30583 1726853793.94128: Set connection var ansible_connection to ssh 30583 1726853793.94134: Set connection var ansible_shell_executable to /bin/sh 30583 1726853793.94137: Set connection var ansible_shell_type to sh 30583 1726853793.94144: Set connection var ansible_pipelining to False 30583 1726853793.94163: variable 'ansible_shell_executable' from source: unknown 30583 1726853793.94166: variable 'ansible_connection' from source: unknown 30583 1726853793.94169: variable 'ansible_module_compression' from source: unknown 30583 1726853793.94174: variable 'ansible_shell_type' from source: unknown 30583 1726853793.94176: variable 'ansible_shell_executable' from source: unknown 30583 1726853793.94178: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853793.94182: variable 'ansible_pipelining' from source: unknown 30583 1726853793.94185: variable 'ansible_timeout' from source: unknown 30583 1726853793.94187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853793.94286: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853793.94298: variable 'omit' from source: magic vars 30583 1726853793.94301: starting attempt loop 30583 1726853793.94303: running the handler 30583 1726853793.94343: variable '__network_connections_result' from source: set_fact 30583 1726853793.94401: variable '__network_connections_result' from source: set_fact 30583 1726853793.94479: handler run complete 30583 1726853793.94495: attempt loop complete, returning result 30583 1726853793.94498: _execute() done 30583 1726853793.94501: dumping result to json 30583 1726853793.94503: done dumping result, returning 30583 1726853793.94512: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-05ea-abc5-0000000026a7] 30583 1726853793.94514: sending task result for task 02083763-bbaf-05ea-abc5-0000000026a7 30583 1726853793.94606: done sending task result for task 02083763-bbaf-05ea-abc5-0000000026a7 30583 1726853793.94609: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "stderr_lines": [ "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } } 30583 1726853793.94716: no more pending results, returning what we have 30583 1726853793.94719: results queue empty 30583 1726853793.94720: checking for any_errors_fatal 30583 1726853793.94725: 
done checking for any_errors_fatal 30583 1726853793.94726: checking for max_fail_percentage 30583 1726853793.94727: done checking for max_fail_percentage 30583 1726853793.94728: checking to see if all hosts have failed and the running result is not ok 30583 1726853793.94728: done checking to see if all hosts have failed 30583 1726853793.94729: getting the remaining hosts for this loop 30583 1726853793.94731: done getting the remaining hosts for this loop 30583 1726853793.94734: getting the next task for host managed_node2 30583 1726853793.94741: done getting next task for host managed_node2 30583 1726853793.94744: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30583 1726853793.94749: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853793.94763: getting variables 30583 1726853793.94765: in VariableManager get_vars() 30583 1726853793.94801: Calling all_inventory to load vars for managed_node2 30583 1726853793.94804: Calling groups_inventory to load vars for managed_node2 30583 1726853793.94806: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853793.94819: Calling all_plugins_play to load vars for managed_node2 30583 1726853793.94822: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853793.94824: Calling groups_plugins_play to load vars for managed_node2 30583 1726853793.95591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853793.96460: done with get_vars() 30583 1726853793.96478: done getting variables 30583 1726853793.96517: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:36:33 -0400 (0:00:00.037) 0:02:09.302 ****** 30583 1726853793.96540: entering _queue_task() for managed_node2/debug 30583 1726853793.96770: worker is 1 (out of 1 available) 30583 1726853793.96785: exiting _queue_task() for managed_node2/debug 30583 1726853793.96799: done queuing things up, now waiting for results queue to drain 30583 1726853793.96800: waiting for pending results... 
30583 1726853793.96986: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
30583 1726853793.97081: in run() - task 02083763-bbaf-05ea-abc5-0000000026a8
30583 1726853793.97092: variable 'ansible_search_path' from source: unknown
30583 1726853793.97095: variable 'ansible_search_path' from source: unknown
30583 1726853793.97122: calling self._execute()
30583 1726853793.97203: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853793.97207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853793.97216: variable 'omit' from source: magic vars
30583 1726853793.97495: variable 'ansible_distribution_major_version' from source: facts
30583 1726853793.97504: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853793.97586: variable 'network_state' from source: role '' defaults
30583 1726853793.97595: Evaluated conditional (network_state != {}): False
30583 1726853793.97597: when evaluation is False, skipping this task
30583 1726853793.97600: _execute() done
30583 1726853793.97602: dumping result to json
30583 1726853793.97605: done dumping result, returning
30583 1726853793.97614: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-05ea-abc5-0000000026a8]
30583 1726853793.97617: sending task result for task 02083763-bbaf-05ea-abc5-0000000026a8
30583 1726853793.97704: done sending task result for task 02083763-bbaf-05ea-abc5-0000000026a8
30583 1726853793.97707: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "false_condition": "network_state != {}"
}
30583 1726853793.97751: no more pending results, returning what we have
30583 1726853793.97755: results queue empty
30583 1726853793.97756: checking for any_errors_fatal
30583 1726853793.97767: done checking for any_errors_fatal
30583 1726853793.97768: checking for max_fail_percentage
30583 1726853793.97770: done checking for max_fail_percentage
30583 1726853793.97772: checking to see if all hosts have failed and the running result is not ok
30583 1726853793.97773: done checking to see if all hosts have failed
30583 1726853793.97774: getting the remaining hosts for this loop
30583 1726853793.97775: done getting the remaining hosts for this loop
30583 1726853793.97779: getting the next task for host managed_node2
30583 1726853793.97786: done getting next task for host managed_node2
30583 1726853793.97790: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
30583 1726853793.97794: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853793.97817: getting variables
30583 1726853793.97818: in VariableManager get_vars()
30583 1726853793.97855: Calling all_inventory to load vars for managed_node2
30583 1726853793.97860: Calling groups_inventory to load vars for managed_node2
30583 1726853793.97862: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853793.97870: Calling all_plugins_play to load vars for managed_node2
30583 1726853793.97877: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853793.97880: Calling groups_plugins_play to load vars for managed_node2
30583 1726853793.98770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853793.99622: done with get_vars()
30583 1726853793.99637: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024 13:36:33 -0400 (0:00:00.031) 0:02:09.334 ******
30583 1726853793.99705: entering _queue_task() for managed_node2/ping
30583 1726853793.99935: worker is 1 (out of 1 available)
30583 1726853793.99949: exiting _queue_task() for managed_node2/ping
30583 1726853793.99966: done queuing things up, now waiting for results queue to drain
30583 1726853793.99967: waiting for pending results...
30583 1726853794.00152: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 30583 1726853794.00246: in run() - task 02083763-bbaf-05ea-abc5-0000000026a9 30583 1726853794.00260: variable 'ansible_search_path' from source: unknown 30583 1726853794.00264: variable 'ansible_search_path' from source: unknown 30583 1726853794.00290: calling self._execute() 30583 1726853794.00373: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853794.00376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853794.00384: variable 'omit' from source: magic vars 30583 1726853794.00664: variable 'ansible_distribution_major_version' from source: facts 30583 1726853794.00673: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853794.00680: variable 'omit' from source: magic vars 30583 1726853794.00720: variable 'omit' from source: magic vars 30583 1726853794.00745: variable 'omit' from source: magic vars 30583 1726853794.00778: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853794.00804: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853794.00820: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853794.00832: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853794.00842: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853794.00868: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853794.00873: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853794.00875: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30583 1726853794.00943: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853794.00948: Set connection var ansible_timeout to 10 30583 1726853794.00951: Set connection var ansible_connection to ssh 30583 1726853794.00955: Set connection var ansible_shell_executable to /bin/sh 30583 1726853794.00965: Set connection var ansible_shell_type to sh 30583 1726853794.00967: Set connection var ansible_pipelining to False 30583 1726853794.00988: variable 'ansible_shell_executable' from source: unknown 30583 1726853794.00991: variable 'ansible_connection' from source: unknown 30583 1726853794.00994: variable 'ansible_module_compression' from source: unknown 30583 1726853794.00996: variable 'ansible_shell_type' from source: unknown 30583 1726853794.00998: variable 'ansible_shell_executable' from source: unknown 30583 1726853794.01001: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853794.01003: variable 'ansible_pipelining' from source: unknown 30583 1726853794.01005: variable 'ansible_timeout' from source: unknown 30583 1726853794.01009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853794.01152: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853794.01163: variable 'omit' from source: magic vars 30583 1726853794.01166: starting attempt loop 30583 1726853794.01168: running the handler 30583 1726853794.01183: _low_level_execute_command(): starting 30583 1726853794.01189: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853794.01687: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 
1726853794.01691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853794.01695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853794.01745: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853794.01748: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853794.01750: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853794.01837: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853794.03595: stdout chunk (state=3): >>>/root <<< 30583 1726853794.03694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853794.03720: stderr chunk (state=3): >>><<< 30583 1726853794.03724: stdout chunk (state=3): >>><<< 30583 1726853794.03742: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853794.03755: _low_level_execute_command(): starting 30583 1726853794.03762: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853794.0374196-36050-96433014222613 `" && echo ansible-tmp-1726853794.0374196-36050-96433014222613="` echo /root/.ansible/tmp/ansible-tmp-1726853794.0374196-36050-96433014222613 `" ) && sleep 0' 30583 1726853794.04190: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853794.04193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853794.04196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853794.04205: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853794.04208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853794.04210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853794.04252: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853794.04258: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853794.04260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853794.04331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853794.06381: stdout chunk (state=3): >>>ansible-tmp-1726853794.0374196-36050-96433014222613=/root/.ansible/tmp/ansible-tmp-1726853794.0374196-36050-96433014222613 <<< 30583 1726853794.06492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853794.06516: stderr chunk (state=3): >>><<< 30583 1726853794.06520: stdout chunk (state=3): >>><<< 30583 1726853794.06533: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853794.0374196-36050-96433014222613=/root/.ansible/tmp/ansible-tmp-1726853794.0374196-36050-96433014222613 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853794.06574: variable 'ansible_module_compression' from source: unknown 30583 1726853794.06606: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30583 1726853794.06636: variable 'ansible_facts' from source: unknown 30583 1726853794.06694: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853794.0374196-36050-96433014222613/AnsiballZ_ping.py 30583 1726853794.06787: Sending initial data 30583 1726853794.06790: Sent initial data (152 bytes) 30583 1726853794.07211: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853794.07214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853794.07216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853794.07219: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853794.07221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853794.07273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853794.07280: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853794.07350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853794.09049: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30583 1726853794.09053: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853794.09114: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853794.09190: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp2tj4pzsk /root/.ansible/tmp/ansible-tmp-1726853794.0374196-36050-96433014222613/AnsiballZ_ping.py <<< 30583 1726853794.09196: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853794.0374196-36050-96433014222613/AnsiballZ_ping.py" <<< 30583 1726853794.09261: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp2tj4pzsk" to remote "/root/.ansible/tmp/ansible-tmp-1726853794.0374196-36050-96433014222613/AnsiballZ_ping.py" <<< 30583 1726853794.09264: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853794.0374196-36050-96433014222613/AnsiballZ_ping.py" <<< 30583 1726853794.09900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853794.09939: stderr chunk (state=3): >>><<< 30583 1726853794.09942: stdout chunk (state=3): >>><<< 30583 1726853794.09974: done transferring module to remote 30583 1726853794.09982: _low_level_execute_command(): starting 30583 1726853794.09987: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853794.0374196-36050-96433014222613/ /root/.ansible/tmp/ansible-tmp-1726853794.0374196-36050-96433014222613/AnsiballZ_ping.py && sleep 0' 30583 1726853794.10408: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853794.10411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853794.10414: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853794.10416: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853794.10423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853794.10478: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853794.10482: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853794.10547: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853794.12473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853794.12498: stderr chunk (state=3): >>><<< 30583 1726853794.12501: stdout chunk (state=3): >>><<< 30583 1726853794.12517: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853794.12521: _low_level_execute_command(): starting 30583 1726853794.12525: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853794.0374196-36050-96433014222613/AnsiballZ_ping.py && sleep 0' 30583 1726853794.12955: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853794.12958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853794.12960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853794.12964: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853794.12966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853794.13014: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853794.13017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853794.13103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853794.28700: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30583 1726853794.30095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853794.30123: stderr chunk (state=3): >>><<< 30583 1726853794.30126: stdout chunk (state=3): >>><<< 30583 1726853794.30140: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853794.30169: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853794.0374196-36050-96433014222613/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853794.30180: _low_level_execute_command(): starting 30583 1726853794.30185: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853794.0374196-36050-96433014222613/ > /dev/null 2>&1 && sleep 0' 30583 1726853794.30631: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853794.30636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853794.30638: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30583 1726853794.30640: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 
1726853794.30642: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853794.30697: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853794.30704: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853794.30709: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853794.30775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853794.32676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853794.32703: stderr chunk (state=3): >>><<< 30583 1726853794.32706: stdout chunk (state=3): >>><<< 30583 1726853794.32720: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853794.32728: handler run complete 30583 1726853794.32739: attempt loop complete, returning result 30583 1726853794.32742: _execute() done 30583 1726853794.32744: dumping result to json 30583 1726853794.32746: done dumping result, returning 30583 1726853794.32754: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-05ea-abc5-0000000026a9] 30583 1726853794.32758: sending task result for task 02083763-bbaf-05ea-abc5-0000000026a9 30583 1726853794.32849: done sending task result for task 02083763-bbaf-05ea-abc5-0000000026a9 30583 1726853794.32851: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 30583 1726853794.32922: no more pending results, returning what we have 30583 1726853794.32925: results queue empty 30583 1726853794.32926: checking for any_errors_fatal 30583 1726853794.32934: done checking for any_errors_fatal 30583 1726853794.32935: checking for max_fail_percentage 30583 1726853794.32937: done checking for max_fail_percentage 30583 1726853794.32938: checking to see if all hosts have failed and the running result is not ok 30583 1726853794.32939: done checking to see if all hosts have failed 30583 1726853794.32940: getting the remaining hosts for this loop 30583 1726853794.32942: done getting the remaining hosts for this loop 30583 1726853794.32945: getting the next task for host managed_node2 30583 1726853794.32957: done getting next task for host managed_node2 30583 1726853794.32959: ^ task is: TASK: meta (role_complete) 30583 1726853794.32966: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853794.32981: getting variables 30583 1726853794.32983: in VariableManager get_vars() 30583 1726853794.33030: Calling all_inventory to load vars for managed_node2 30583 1726853794.33033: Calling groups_inventory to load vars for managed_node2 30583 1726853794.33036: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853794.33045: Calling all_plugins_play to load vars for managed_node2 30583 1726853794.33048: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853794.33050: Calling groups_plugins_play to load vars for managed_node2 30583 1726853794.33889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853794.34741: done with get_vars() 30583 1726853794.34758: done getting variables 30583 1726853794.34820: done queuing things up, now waiting for results queue to drain 30583 1726853794.34822: results queue empty 30583 1726853794.34822: checking for any_errors_fatal 30583 1726853794.34824: done checking for 
any_errors_fatal 30583 1726853794.34825: checking for max_fail_percentage 30583 1726853794.34825: done checking for max_fail_percentage 30583 1726853794.34826: checking to see if all hosts have failed and the running result is not ok 30583 1726853794.34826: done checking to see if all hosts have failed 30583 1726853794.34827: getting the remaining hosts for this loop 30583 1726853794.34827: done getting the remaining hosts for this loop 30583 1726853794.34829: getting the next task for host managed_node2 30583 1726853794.34832: done getting next task for host managed_node2 30583 1726853794.34834: ^ task is: TASK: Asserts 30583 1726853794.34835: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853794.34837: getting variables 30583 1726853794.34838: in VariableManager get_vars() 30583 1726853794.34846: Calling all_inventory to load vars for managed_node2 30583 1726853794.34847: Calling groups_inventory to load vars for managed_node2 30583 1726853794.34849: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853794.34852: Calling all_plugins_play to load vars for managed_node2 30583 1726853794.34854: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853794.34855: Calling groups_plugins_play to load vars for managed_node2 30583 1726853794.35556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853794.36386: done with get_vars() 30583 1726853794.36401: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 13:36:34 -0400 (0:00:00.367) 0:02:09.701 ****** 30583 1726853794.36455: entering _queue_task() for managed_node2/include_tasks 30583 1726853794.36766: worker is 1 (out of 1 available) 30583 1726853794.36781: exiting _queue_task() for managed_node2/include_tasks 30583 1726853794.36794: done queuing things up, now waiting for results queue to drain 30583 1726853794.36796: waiting for pending results... 
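
The "Asserts" task queued above (run_test.yml:36) appears, from the log, to be an `include_tasks` loop over the `lsr_assert` list, gated on the `ansible_distribution_major_version != '6'` conditional that the executor evaluates for each item. A hypothetical sketch of that task — the loop items are taken from the log, but the exact wording in run_test.yml is an assumption:

```yaml
# Hypothetical reconstruction of the "Asserts" task at
# tests/network/playbooks/tasks/run_test.yml:36, inferred from this log.
# The log shows lsr_assert items such as tasks/assert_profile_absent.yml
# and tasks/get_NetworkManager_NVR.yml, and the conditional below being
# evaluated as True for each item before inclusion.
- name: Asserts
  include_tasks: "{{ item }}"
  loop: "{{ lsr_assert }}"
  when: ansible_distribution_major_version != '6'
```
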
30583 1726853794.37004: running TaskExecutor() for managed_node2/TASK: Asserts 30583 1726853794.37092: in run() - task 02083763-bbaf-05ea-abc5-0000000020b2 30583 1726853794.37103: variable 'ansible_search_path' from source: unknown 30583 1726853794.37106: variable 'ansible_search_path' from source: unknown 30583 1726853794.37145: variable 'lsr_assert' from source: include params 30583 1726853794.37321: variable 'lsr_assert' from source: include params 30583 1726853794.37384: variable 'omit' from source: magic vars 30583 1726853794.37487: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853794.37494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853794.37503: variable 'omit' from source: magic vars 30583 1726853794.37675: variable 'ansible_distribution_major_version' from source: facts 30583 1726853794.37683: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853794.37689: variable 'item' from source: unknown 30583 1726853794.37733: variable 'item' from source: unknown 30583 1726853794.37756: variable 'item' from source: unknown 30583 1726853794.37803: variable 'item' from source: unknown 30583 1726853794.37933: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853794.37936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853794.37939: variable 'omit' from source: magic vars 30583 1726853794.38012: variable 'ansible_distribution_major_version' from source: facts 30583 1726853794.38016: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853794.38022: variable 'item' from source: unknown 30583 1726853794.38068: variable 'item' from source: unknown 30583 1726853794.38090: variable 'item' from source: unknown 30583 1726853794.38133: variable 'item' from source: unknown 30583 1726853794.38203: dumping result to json 30583 1726853794.38205: done dumping result, returning 30583 
1726853794.38207: done running TaskExecutor() for managed_node2/TASK: Asserts [02083763-bbaf-05ea-abc5-0000000020b2] 30583 1726853794.38209: sending task result for task 02083763-bbaf-05ea-abc5-0000000020b2 30583 1726853794.38241: done sending task result for task 02083763-bbaf-05ea-abc5-0000000020b2 30583 1726853794.38243: WORKER PROCESS EXITING 30583 1726853794.38329: no more pending results, returning what we have 30583 1726853794.38334: in VariableManager get_vars() 30583 1726853794.38379: Calling all_inventory to load vars for managed_node2 30583 1726853794.38382: Calling groups_inventory to load vars for managed_node2 30583 1726853794.38385: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853794.38398: Calling all_plugins_play to load vars for managed_node2 30583 1726853794.38401: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853794.38403: Calling groups_plugins_play to load vars for managed_node2 30583 1726853794.39191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853794.40132: done with get_vars() 30583 1726853794.40145: variable 'ansible_search_path' from source: unknown 30583 1726853794.40145: variable 'ansible_search_path' from source: unknown 30583 1726853794.40176: variable 'ansible_search_path' from source: unknown 30583 1726853794.40177: variable 'ansible_search_path' from source: unknown 30583 1726853794.40194: we have included files to process 30583 1726853794.40194: generating all_blocks data 30583 1726853794.40196: done generating all_blocks data 30583 1726853794.40202: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30583 1726853794.40203: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30583 1726853794.40204: Loading data from 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30583 1726853794.40278: in VariableManager get_vars() 30583 1726853794.40294: done with get_vars() 30583 1726853794.40368: done processing included file 30583 1726853794.40370: iterating over new_blocks loaded from include file 30583 1726853794.40372: in VariableManager get_vars() 30583 1726853794.40383: done with get_vars() 30583 1726853794.40384: filtering new block on tags 30583 1726853794.40405: done filtering new block on tags 30583 1726853794.40407: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node2 => (item=tasks/assert_profile_absent.yml) 30583 1726853794.40410: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml 30583 1726853794.40410: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml 30583 1726853794.40413: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml 30583 1726853794.40648: done processing included file 30583 1726853794.40650: iterating over new_blocks loaded from include file 30583 1726853794.40651: in VariableManager get_vars() 30583 1726853794.40661: done with get_vars() 30583 1726853794.40662: filtering new block on tags 30583 1726853794.40690: done filtering new block on tags 30583 1726853794.40691: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml for managed_node2 => (item=tasks/get_NetworkManager_NVR.yml) 30583 1726853794.40694: extending task lists 
for all hosts with included blocks 30583 1726853794.41309: done extending task lists 30583 1726853794.41310: done processing included files 30583 1726853794.41311: results queue empty 30583 1726853794.41311: checking for any_errors_fatal 30583 1726853794.41312: done checking for any_errors_fatal 30583 1726853794.41313: checking for max_fail_percentage 30583 1726853794.41314: done checking for max_fail_percentage 30583 1726853794.41315: checking to see if all hosts have failed and the running result is not ok 30583 1726853794.41315: done checking to see if all hosts have failed 30583 1726853794.41315: getting the remaining hosts for this loop 30583 1726853794.41316: done getting the remaining hosts for this loop 30583 1726853794.41318: getting the next task for host managed_node2 30583 1726853794.41321: done getting next task for host managed_node2 30583 1726853794.41322: ^ task is: TASK: Include the task 'get_profile_stat.yml' 30583 1726853794.41324: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853794.41326: getting variables 30583 1726853794.41330: in VariableManager get_vars() 30583 1726853794.41338: Calling all_inventory to load vars for managed_node2 30583 1726853794.41339: Calling groups_inventory to load vars for managed_node2 30583 1726853794.41341: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853794.41344: Calling all_plugins_play to load vars for managed_node2 30583 1726853794.41346: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853794.41347: Calling groups_plugins_play to load vars for managed_node2 30583 1726853794.41972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853794.42797: done with get_vars() 30583 1726853794.42811: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 13:36:34 -0400 (0:00:00.064) 0:02:09.765 ****** 30583 1726853794.42857: entering _queue_task() for managed_node2/include_tasks 30583 1726853794.43120: worker is 1 (out of 1 available) 30583 1726853794.43134: exiting _queue_task() for managed_node2/include_tasks 30583 1726853794.43148: done queuing things up, now waiting for results queue to drain 30583 1726853794.43149: waiting for pending results... 
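
The include task at assert_profile_absent.yml:3, whose header and task path are printed above, is presumably a plain `include_tasks` of the stat helper file; a minimal sketch, assuming the task body matches its printed name:

```yaml
# Hypothetical sketch of the task at
# tests/network/playbooks/tasks/assert_profile_absent.yml:3; the task name
# matches the TASK header in the log, the included path is the file the
# log reports loading next (get_profile_stat.yml).
- name: Include the task 'get_profile_stat.yml'
  include_tasks: tasks/get_profile_stat.yml
```
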
30583 1726853794.43344: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 30583 1726853794.43423: in run() - task 02083763-bbaf-05ea-abc5-000000002804 30583 1726853794.43432: variable 'ansible_search_path' from source: unknown 30583 1726853794.43436: variable 'ansible_search_path' from source: unknown 30583 1726853794.43466: calling self._execute() 30583 1726853794.43546: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853794.43549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853794.43559: variable 'omit' from source: magic vars 30583 1726853794.43849: variable 'ansible_distribution_major_version' from source: facts 30583 1726853794.43858: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853794.43866: _execute() done 30583 1726853794.43869: dumping result to json 30583 1726853794.43873: done dumping result, returning 30583 1726853794.43881: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-05ea-abc5-000000002804] 30583 1726853794.43885: sending task result for task 02083763-bbaf-05ea-abc5-000000002804 30583 1726853794.43966: done sending task result for task 02083763-bbaf-05ea-abc5-000000002804 30583 1726853794.43969: WORKER PROCESS EXITING 30583 1726853794.43997: no more pending results, returning what we have 30583 1726853794.44002: in VariableManager get_vars() 30583 1726853794.44052: Calling all_inventory to load vars for managed_node2 30583 1726853794.44055: Calling groups_inventory to load vars for managed_node2 30583 1726853794.44058: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853794.44069: Calling all_plugins_play to load vars for managed_node2 30583 1726853794.44074: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853794.44077: Calling groups_plugins_play to load vars for managed_node2 30583 
1726853794.49042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853794.49874: done with get_vars() 30583 1726853794.49890: variable 'ansible_search_path' from source: unknown 30583 1726853794.49891: variable 'ansible_search_path' from source: unknown 30583 1726853794.49898: variable 'item' from source: include params 30583 1726853794.49961: variable 'item' from source: include params 30583 1726853794.49986: we have included files to process 30583 1726853794.49986: generating all_blocks data 30583 1726853794.49987: done generating all_blocks data 30583 1726853794.49988: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30583 1726853794.49989: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30583 1726853794.49990: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30583 1726853794.50551: done processing included file 30583 1726853794.50552: iterating over new_blocks loaded from include file 30583 1726853794.50553: in VariableManager get_vars() 30583 1726853794.50567: done with get_vars() 30583 1726853794.50569: filtering new block on tags 30583 1726853794.50608: done filtering new block on tags 30583 1726853794.50610: in VariableManager get_vars() 30583 1726853794.50619: done with get_vars() 30583 1726853794.50620: filtering new block on tags 30583 1726853794.50650: done filtering new block on tags 30583 1726853794.50651: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 30583 1726853794.50654: extending task lists for all hosts with included blocks 30583 1726853794.50792: done 
extending task lists 30583 1726853794.50793: done processing included files 30583 1726853794.50794: results queue empty 30583 1726853794.50794: checking for any_errors_fatal 30583 1726853794.50796: done checking for any_errors_fatal 30583 1726853794.50797: checking for max_fail_percentage 30583 1726853794.50798: done checking for max_fail_percentage 30583 1726853794.50798: checking to see if all hosts have failed and the running result is not ok 30583 1726853794.50799: done checking to see if all hosts have failed 30583 1726853794.50799: getting the remaining hosts for this loop 30583 1726853794.50800: done getting the remaining hosts for this loop 30583 1726853794.50801: getting the next task for host managed_node2 30583 1726853794.50804: done getting next task for host managed_node2 30583 1726853794.50806: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 30583 1726853794.50807: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30583 1726853794.50809: getting variables 30583 1726853794.50809: in VariableManager get_vars() 30583 1726853794.50817: Calling all_inventory to load vars for managed_node2 30583 1726853794.50818: Calling groups_inventory to load vars for managed_node2 30583 1726853794.50820: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853794.50823: Calling all_plugins_play to load vars for managed_node2 30583 1726853794.50825: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853794.50827: Calling groups_plugins_play to load vars for managed_node2 30583 1726853794.51436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853794.52275: done with get_vars() 30583 1726853794.52293: done getting variables 30583 1726853794.52323: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:36:34 -0400 (0:00:00.094) 0:02:09.860 ****** 30583 1726853794.52343: entering _queue_task() for managed_node2/set_fact 30583 1726853794.52627: worker is 1 (out of 1 available) 30583 1726853794.52645: exiting _queue_task() for managed_node2/set_fact 30583 1726853794.52657: done queuing things up, now waiting for results queue to drain 30583 1726853794.52658: waiting for pending results... 
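
The `set_fact` task queued here (get_profile_stat.yml:3) initializes three flags that later stat/assert steps test. Its fact names and initial values are taken verbatim from the `ok: [managed_node2]` result this run prints for the task; the YAML layout itself is an assumption:

```yaml
# Hypothetical sketch of the task at
# tests/network/playbooks/tasks/get_profile_stat.yml:3. The three facts and
# their false defaults appear verbatim in the task's "ok" result in this log.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```
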
30583 1726853794.52851: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 30583 1726853794.52937: in run() - task 02083763-bbaf-05ea-abc5-000000002888 30583 1726853794.52948: variable 'ansible_search_path' from source: unknown 30583 1726853794.52951: variable 'ansible_search_path' from source: unknown 30583 1726853794.52988: calling self._execute() 30583 1726853794.53070: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853794.53075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853794.53084: variable 'omit' from source: magic vars 30583 1726853794.53373: variable 'ansible_distribution_major_version' from source: facts 30583 1726853794.53383: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853794.53390: variable 'omit' from source: magic vars 30583 1726853794.53425: variable 'omit' from source: magic vars 30583 1726853794.53451: variable 'omit' from source: magic vars 30583 1726853794.53486: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853794.53513: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853794.53530: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853794.53545: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853794.53555: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853794.53585: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853794.53589: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853794.53591: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30583 1726853794.53659: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853794.53667: Set connection var ansible_timeout to 10 30583 1726853794.53672: Set connection var ansible_connection to ssh 30583 1726853794.53675: Set connection var ansible_shell_executable to /bin/sh 30583 1726853794.53679: Set connection var ansible_shell_type to sh 30583 1726853794.53687: Set connection var ansible_pipelining to False 30583 1726853794.53704: variable 'ansible_shell_executable' from source: unknown 30583 1726853794.53708: variable 'ansible_connection' from source: unknown 30583 1726853794.53711: variable 'ansible_module_compression' from source: unknown 30583 1726853794.53713: variable 'ansible_shell_type' from source: unknown 30583 1726853794.53716: variable 'ansible_shell_executable' from source: unknown 30583 1726853794.53718: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853794.53721: variable 'ansible_pipelining' from source: unknown 30583 1726853794.53723: variable 'ansible_timeout' from source: unknown 30583 1726853794.53725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853794.53829: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853794.53839: variable 'omit' from source: magic vars 30583 1726853794.53845: starting attempt loop 30583 1726853794.53848: running the handler 30583 1726853794.53859: handler run complete 30583 1726853794.53872: attempt loop complete, returning result 30583 1726853794.53876: _execute() done 30583 1726853794.53879: dumping result to json 30583 1726853794.53881: done dumping result, returning 30583 1726853794.53887: done running TaskExecutor() for 
managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [02083763-bbaf-05ea-abc5-000000002888] 30583 1726853794.53890: sending task result for task 02083763-bbaf-05ea-abc5-000000002888 30583 1726853794.53965: done sending task result for task 02083763-bbaf-05ea-abc5-000000002888 30583 1726853794.53968: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 30583 1726853794.54022: no more pending results, returning what we have 30583 1726853794.54026: results queue empty 30583 1726853794.54027: checking for any_errors_fatal 30583 1726853794.54029: done checking for any_errors_fatal 30583 1726853794.54030: checking for max_fail_percentage 30583 1726853794.54032: done checking for max_fail_percentage 30583 1726853794.54033: checking to see if all hosts have failed and the running result is not ok 30583 1726853794.54033: done checking to see if all hosts have failed 30583 1726853794.54034: getting the remaining hosts for this loop 30583 1726853794.54036: done getting the remaining hosts for this loop 30583 1726853794.54039: getting the next task for host managed_node2 30583 1726853794.54048: done getting next task for host managed_node2 30583 1726853794.54050: ^ task is: TASK: Stat profile file 30583 1726853794.54056: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853794.54060: getting variables 30583 1726853794.54062: in VariableManager get_vars() 30583 1726853794.54107: Calling all_inventory to load vars for managed_node2 30583 1726853794.54110: Calling groups_inventory to load vars for managed_node2 30583 1726853794.54113: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853794.54123: Calling all_plugins_play to load vars for managed_node2 30583 1726853794.54125: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853794.54128: Calling groups_plugins_play to load vars for managed_node2 30583 1726853794.54977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853794.55838: done with get_vars() 30583 1726853794.55852: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:36:34 -0400 (0:00:00.035) 0:02:09.896 ****** 30583 1726853794.55920: entering _queue_task() for managed_node2/stat 30583 1726853794.56153: worker is 1 (out of 1 available) 30583 1726853794.56166: exiting _queue_task() for managed_node2/stat 30583 1726853794.56180: done queuing things up, now waiting for results queue to drain 30583 1726853794.56182: 
waiting for pending results... 30583 1726853794.56364: running TaskExecutor() for managed_node2/TASK: Stat profile file 30583 1726853794.56453: in run() - task 02083763-bbaf-05ea-abc5-000000002889 30583 1726853794.56466: variable 'ansible_search_path' from source: unknown 30583 1726853794.56472: variable 'ansible_search_path' from source: unknown 30583 1726853794.56501: calling self._execute() 30583 1726853794.56579: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853794.56583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853794.56592: variable 'omit' from source: magic vars 30583 1726853794.56876: variable 'ansible_distribution_major_version' from source: facts 30583 1726853794.56885: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853794.56891: variable 'omit' from source: magic vars 30583 1726853794.56924: variable 'omit' from source: magic vars 30583 1726853794.56998: variable 'profile' from source: play vars 30583 1726853794.57002: variable 'interface' from source: play vars 30583 1726853794.57050: variable 'interface' from source: play vars 30583 1726853794.57070: variable 'omit' from source: magic vars 30583 1726853794.57102: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853794.57128: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853794.57145: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853794.57158: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853794.57174: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853794.57199: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 30583 1726853794.57203: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853794.57207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853794.57275: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853794.57279: Set connection var ansible_timeout to 10 30583 1726853794.57283: Set connection var ansible_connection to ssh 30583 1726853794.57291: Set connection var ansible_shell_executable to /bin/sh 30583 1726853794.57294: Set connection var ansible_shell_type to sh 30583 1726853794.57300: Set connection var ansible_pipelining to False 30583 1726853794.57318: variable 'ansible_shell_executable' from source: unknown 30583 1726853794.57321: variable 'ansible_connection' from source: unknown 30583 1726853794.57324: variable 'ansible_module_compression' from source: unknown 30583 1726853794.57326: variable 'ansible_shell_type' from source: unknown 30583 1726853794.57328: variable 'ansible_shell_executable' from source: unknown 30583 1726853794.57330: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853794.57332: variable 'ansible_pipelining' from source: unknown 30583 1726853794.57335: variable 'ansible_timeout' from source: unknown 30583 1726853794.57338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853794.57485: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853794.57494: variable 'omit' from source: magic vars 30583 1726853794.57499: starting attempt loop 30583 1726853794.57504: running the handler 30583 1726853794.57515: _low_level_execute_command(): starting 30583 1726853794.57521: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 
1726853794.58026: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853794.58029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853794.58032: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853794.58034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853794.58092: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853794.58095: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853794.58098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853794.58182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853794.59928: stdout chunk (state=3): >>>/root <<< 30583 1726853794.60027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853794.60053: stderr chunk (state=3): >>><<< 30583 1726853794.60057: stdout chunk (state=3): >>><<< 30583 1726853794.60080: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853794.60092: _low_level_execute_command(): starting 30583 1726853794.60097: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853794.6007771-36061-108798335668102 `" && echo ansible-tmp-1726853794.6007771-36061-108798335668102="` echo /root/.ansible/tmp/ansible-tmp-1726853794.6007771-36061-108798335668102 `" ) && sleep 0' 30583 1726853794.60513: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853794.60519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853794.60528: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853794.60531: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853794.60533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853794.60578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853794.60585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853794.60659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853794.62701: stdout chunk (state=3): >>>ansible-tmp-1726853794.6007771-36061-108798335668102=/root/.ansible/tmp/ansible-tmp-1726853794.6007771-36061-108798335668102 <<< 30583 1726853794.62806: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853794.62829: stderr chunk (state=3): >>><<< 30583 1726853794.62832: stdout chunk (state=3): >>><<< 30583 1726853794.62846: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853794.6007771-36061-108798335668102=/root/.ansible/tmp/ansible-tmp-1726853794.6007771-36061-108798335668102 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853794.62884: variable 'ansible_module_compression' from source: unknown 30583 1726853794.62936: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30583 1726853794.62967: variable 'ansible_facts' from source: unknown 30583 1726853794.63033: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853794.6007771-36061-108798335668102/AnsiballZ_stat.py 30583 1726853794.63128: Sending initial data 30583 1726853794.63131: Sent initial data (153 bytes) 30583 1726853794.63574: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853794.63578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853794.63580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 
1726853794.63582: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853794.63584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853794.63620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853794.63634: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853794.63708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853794.65385: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30583 1726853794.65389: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853794.65452: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853794.65527: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpax90ws_8 /root/.ansible/tmp/ansible-tmp-1726853794.6007771-36061-108798335668102/AnsiballZ_stat.py <<< 30583 1726853794.65530: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853794.6007771-36061-108798335668102/AnsiballZ_stat.py" <<< 30583 1726853794.65595: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpax90ws_8" to remote "/root/.ansible/tmp/ansible-tmp-1726853794.6007771-36061-108798335668102/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853794.6007771-36061-108798335668102/AnsiballZ_stat.py" <<< 30583 1726853794.66282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853794.66288: stderr chunk (state=3): >>><<< 30583 1726853794.66290: stdout chunk (state=3): >>><<< 30583 1726853794.66329: done transferring module to remote 30583 1726853794.66338: _low_level_execute_command(): starting 30583 1726853794.66343: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853794.6007771-36061-108798335668102/ /root/.ansible/tmp/ansible-tmp-1726853794.6007771-36061-108798335668102/AnsiballZ_stat.py && sleep 0' 30583 1726853794.66777: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853794.66781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853794.66783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 
1726853794.66785: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853794.66792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853794.66838: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853794.66844: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853794.66911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853794.68821: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853794.68843: stderr chunk (state=3): >>><<< 30583 1726853794.68846: stdout chunk (state=3): >>><<< 30583 1726853794.68863: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853794.68867: _low_level_execute_command(): starting 30583 1726853794.68870: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853794.6007771-36061-108798335668102/AnsiballZ_stat.py && sleep 0' 30583 1726853794.69285: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853794.69290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853794.69292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853794.69295: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853794.69296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853794.69299: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853794.69342: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853794.69345: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853794.69428: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853794.85529: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30583 1726853794.86970: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853794.86994: stderr chunk (state=3): >>><<< 30583 1726853794.86997: stdout chunk (state=3): >>><<< 30583 1726853794.87012: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853794.87040: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853794.6007771-36061-108798335668102/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853794.87047: _low_level_execute_command(): starting 30583 1726853794.87051: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853794.6007771-36061-108798335668102/ > /dev/null 2>&1 && sleep 0' 30583 1726853794.87487: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853794.87491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853794.87493: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853794.87495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853794.87547: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853794.87551: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853794.87556: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853794.87626: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853794.89554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853794.89582: stderr chunk (state=3): >>><<< 30583 1726853794.89585: stdout chunk (state=3): >>><<< 30583 1726853794.89597: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853794.89603: handler run complete 30583 1726853794.89619: attempt loop complete, returning result 30583 1726853794.89622: _execute() done 30583 1726853794.89624: dumping result to json 30583 1726853794.89626: done dumping result, returning 30583 1726853794.89635: done running TaskExecutor() for managed_node2/TASK: Stat profile file [02083763-bbaf-05ea-abc5-000000002889] 30583 1726853794.89639: sending task result for task 02083763-bbaf-05ea-abc5-000000002889 30583 1726853794.89732: done sending task result for task 02083763-bbaf-05ea-abc5-000000002889 30583 1726853794.89736: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 30583 1726853794.89804: no more pending results, returning what we have 30583 1726853794.89808: results queue empty 30583 1726853794.89809: checking for any_errors_fatal 30583 1726853794.89816: done checking for any_errors_fatal 30583 1726853794.89817: checking for max_fail_percentage 30583 1726853794.89819: done checking for max_fail_percentage 30583 1726853794.89820: checking to see if all hosts have failed and the running result is not ok 30583 1726853794.89820: done checking to see if all hosts have failed 30583 1726853794.89821: getting the remaining hosts for this loop 30583 1726853794.89823: done getting the remaining hosts for this loop 30583 1726853794.89826: getting the next task for host managed_node2 
30583 1726853794.89835: done getting next task for host managed_node2 30583 1726853794.89838: ^ task is: TASK: Set NM profile exist flag based on the profile files 30583 1726853794.89843: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853794.89848: getting variables 30583 1726853794.89852: in VariableManager get_vars() 30583 1726853794.89897: Calling all_inventory to load vars for managed_node2 30583 1726853794.89900: Calling groups_inventory to load vars for managed_node2 30583 1726853794.89903: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853794.89913: Calling all_plugins_play to load vars for managed_node2 30583 1726853794.89916: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853794.89918: Calling groups_plugins_play to load vars for managed_node2 30583 1726853794.90741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853794.91724: done with get_vars() 30583 1726853794.91739: done getting variables 30583 1726853794.91786: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:36:34 -0400 (0:00:00.358) 0:02:10.255 ****** 30583 1726853794.91811: entering _queue_task() for managed_node2/set_fact 30583 1726853794.92061: worker is 1 (out of 1 available) 30583 1726853794.92076: exiting _queue_task() for managed_node2/set_fact 30583 1726853794.92090: done queuing things up, now waiting for results queue to drain 30583 1726853794.92091: waiting for pending results... 
30583 1726853794.92283: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 30583 1726853794.92381: in run() - task 02083763-bbaf-05ea-abc5-00000000288a 30583 1726853794.92392: variable 'ansible_search_path' from source: unknown 30583 1726853794.92396: variable 'ansible_search_path' from source: unknown 30583 1726853794.92425: calling self._execute() 30583 1726853794.92505: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853794.92508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853794.92516: variable 'omit' from source: magic vars 30583 1726853794.92804: variable 'ansible_distribution_major_version' from source: facts 30583 1726853794.92813: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853794.92902: variable 'profile_stat' from source: set_fact 30583 1726853794.92910: Evaluated conditional (profile_stat.stat.exists): False 30583 1726853794.92914: when evaluation is False, skipping this task 30583 1726853794.92917: _execute() done 30583 1726853794.92919: dumping result to json 30583 1726853794.92922: done dumping result, returning 30583 1726853794.92929: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [02083763-bbaf-05ea-abc5-00000000288a] 30583 1726853794.92932: sending task result for task 02083763-bbaf-05ea-abc5-00000000288a 30583 1726853794.93013: done sending task result for task 02083763-bbaf-05ea-abc5-00000000288a 30583 1726853794.93015: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30583 1726853794.93063: no more pending results, returning what we have 30583 1726853794.93068: results queue empty 30583 1726853794.93069: checking for any_errors_fatal 30583 1726853794.93085: done checking for any_errors_fatal 30583 1726853794.93085: 
checking for max_fail_percentage 30583 1726853794.93087: done checking for max_fail_percentage 30583 1726853794.93088: checking to see if all hosts have failed and the running result is not ok 30583 1726853794.93089: done checking to see if all hosts have failed 30583 1726853794.93089: getting the remaining hosts for this loop 30583 1726853794.93091: done getting the remaining hosts for this loop 30583 1726853794.93095: getting the next task for host managed_node2 30583 1726853794.93103: done getting next task for host managed_node2 30583 1726853794.93105: ^ task is: TASK: Get NM profile info 30583 1726853794.93110: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853794.93115: getting variables 30583 1726853794.93116: in VariableManager get_vars() 30583 1726853794.93153: Calling all_inventory to load vars for managed_node2 30583 1726853794.93156: Calling groups_inventory to load vars for managed_node2 30583 1726853794.93162: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853794.93178: Calling all_plugins_play to load vars for managed_node2 30583 1726853794.93181: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853794.93184: Calling groups_plugins_play to load vars for managed_node2 30583 1726853794.93955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853794.94829: done with get_vars() 30583 1726853794.94844: done getting variables 30583 1726853794.94890: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:36:34 -0400 (0:00:00.031) 0:02:10.286 ****** 30583 1726853794.94915: entering _queue_task() for managed_node2/shell 30583 1726853794.95141: worker is 1 (out of 1 available) 30583 1726853794.95153: exiting _queue_task() for managed_node2/shell 30583 1726853794.95169: done queuing things up, now waiting for results queue to drain 30583 1726853794.95172: waiting for pending results... 
30583 1726853794.95348: running TaskExecutor() for managed_node2/TASK: Get NM profile info 30583 1726853794.95432: in run() - task 02083763-bbaf-05ea-abc5-00000000288b 30583 1726853794.95444: variable 'ansible_search_path' from source: unknown 30583 1726853794.95448: variable 'ansible_search_path' from source: unknown 30583 1726853794.95477: calling self._execute() 30583 1726853794.95556: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853794.95562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853794.95569: variable 'omit' from source: magic vars 30583 1726853794.95844: variable 'ansible_distribution_major_version' from source: facts 30583 1726853794.95854: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853794.95864: variable 'omit' from source: magic vars 30583 1726853794.95901: variable 'omit' from source: magic vars 30583 1726853794.95975: variable 'profile' from source: play vars 30583 1726853794.95978: variable 'interface' from source: play vars 30583 1726853794.96024: variable 'interface' from source: play vars 30583 1726853794.96039: variable 'omit' from source: magic vars 30583 1726853794.96076: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853794.96102: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853794.96119: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853794.96132: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853794.96141: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853794.96168: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 
1726853794.96173: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853794.96176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853794.96242: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853794.96247: Set connection var ansible_timeout to 10 30583 1726853794.96250: Set connection var ansible_connection to ssh 30583 1726853794.96255: Set connection var ansible_shell_executable to /bin/sh 30583 1726853794.96262: Set connection var ansible_shell_type to sh 30583 1726853794.96270: Set connection var ansible_pipelining to False 30583 1726853794.96288: variable 'ansible_shell_executable' from source: unknown 30583 1726853794.96291: variable 'ansible_connection' from source: unknown 30583 1726853794.96293: variable 'ansible_module_compression' from source: unknown 30583 1726853794.96295: variable 'ansible_shell_type' from source: unknown 30583 1726853794.96297: variable 'ansible_shell_executable' from source: unknown 30583 1726853794.96300: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853794.96304: variable 'ansible_pipelining' from source: unknown 30583 1726853794.96307: variable 'ansible_timeout' from source: unknown 30583 1726853794.96309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853794.96410: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853794.96420: variable 'omit' from source: magic vars 30583 1726853794.96425: starting attempt loop 30583 1726853794.96428: running the handler 30583 1726853794.96436: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853794.96453: _low_level_execute_command(): starting 30583 1726853794.96462: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853794.96986: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853794.96990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853794.96993: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853794.96995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853794.97046: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853794.97049: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853794.97051: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853794.97131: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853794.98901: stdout chunk (state=3): >>>/root <<< 30583 1726853794.98998: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853794.99027: stderr chunk (state=3): >>><<< 30583 1726853794.99030: stdout chunk (state=3): >>><<< 30583 1726853794.99051: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853794.99064: _low_level_execute_command(): starting 30583 1726853794.99070: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853794.990509-36073-53874505024683 `" && echo ansible-tmp-1726853794.990509-36073-53874505024683="` echo /root/.ansible/tmp/ansible-tmp-1726853794.990509-36073-53874505024683 `" ) && sleep 0' 30583 1726853794.99517: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 30583 1726853794.99520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853794.99522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address <<< 30583 1726853794.99525: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853794.99527: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853794.99584: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853794.99587: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853794.99654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853795.01722: stdout chunk (state=3): >>>ansible-tmp-1726853794.990509-36073-53874505024683=/root/.ansible/tmp/ansible-tmp-1726853794.990509-36073-53874505024683 <<< 30583 1726853795.01837: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853795.01863: stderr chunk (state=3): >>><<< 30583 1726853795.01866: stdout chunk (state=3): >>><<< 30583 1726853795.01882: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853794.990509-36073-53874505024683=/root/.ansible/tmp/ansible-tmp-1726853794.990509-36073-53874505024683 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853795.01909: variable 'ansible_module_compression' from source: unknown 30583 1726853795.01954: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30583 1726853795.01987: variable 'ansible_facts' from source: unknown 30583 1726853795.02042: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853794.990509-36073-53874505024683/AnsiballZ_command.py 30583 1726853795.02138: Sending initial data 30583 1726853795.02141: Sent initial data (154 bytes) 30583 1726853795.02579: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
<<< 30583 1726853795.02583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853795.02585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853795.02589: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853795.02592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853795.02640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853795.02647: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853795.02649: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853795.02719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853795.04416: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30583 1726853795.04420: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853795.04485: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853795.04553: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpxlf_tp41 /root/.ansible/tmp/ansible-tmp-1726853794.990509-36073-53874505024683/AnsiballZ_command.py <<< 30583 1726853795.04559: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853794.990509-36073-53874505024683/AnsiballZ_command.py" <<< 30583 1726853795.04624: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpxlf_tp41" to remote "/root/.ansible/tmp/ansible-tmp-1726853794.990509-36073-53874505024683/AnsiballZ_command.py" <<< 30583 1726853795.04627: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853794.990509-36073-53874505024683/AnsiballZ_command.py" <<< 30583 1726853795.05270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853795.05307: stderr chunk (state=3): >>><<< 30583 1726853795.05311: stdout chunk (state=3): >>><<< 30583 1726853795.05341: done transferring module to remote 30583 1726853795.05349: _low_level_execute_command(): starting 30583 1726853795.05354: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853794.990509-36073-53874505024683/ /root/.ansible/tmp/ansible-tmp-1726853794.990509-36073-53874505024683/AnsiballZ_command.py && sleep 0' 30583 1726853795.05784: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853795.05787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853795.05793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853795.05795: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration <<< 30583 1726853795.05798: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853795.05800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853795.05847: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853795.05852: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853795.05920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853795.07817: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853795.07841: stderr chunk (state=3): >>><<< 30583 1726853795.07844: stdout chunk (state=3): >>><<< 30583 1726853795.07857: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853795.07862: _low_level_execute_command(): starting 30583 1726853795.07867: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853794.990509-36073-53874505024683/AnsiballZ_command.py && sleep 0' 30583 1726853795.08287: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853795.08290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853795.08292: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853795.08294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853795.08346: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853795.08353: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853795.08354: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853795.08429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853795.25958: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 13:36:35.242085", "end": "2024-09-20 13:36:35.258413", "delta": "0:00:00.016328", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30583 1726853795.27554: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853795.27586: stderr chunk (state=3): >>><<< 30583 1726853795.27589: stdout chunk (state=3): >>><<< 30583 1726853795.27605: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 13:36:35.242085", "end": "2024-09-20 13:36:35.258413", "delta": "0:00:00.016328", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.197 
closed. 30583 1726853795.27635: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853794.990509-36073-53874505024683/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853795.27645: _low_level_execute_command(): starting 30583 1726853795.27649: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853794.990509-36073-53874505024683/ > /dev/null 2>&1 && sleep 0' 30583 1726853795.28113: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853795.28116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853795.28119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853795.28121: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853795.28123: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853795.28176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853795.28180: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853795.28182: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853795.28259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853795.30165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853795.30190: stderr chunk (state=3): >>><<< 30583 1726853795.30193: stdout chunk (state=3): >>><<< 30583 1726853795.30206: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853795.30211: handler run complete 30583 1726853795.30228: Evaluated conditional (False): False 30583 1726853795.30237: attempt loop complete, returning result 30583 1726853795.30240: _execute() done 30583 1726853795.30242: dumping result to json 30583 1726853795.30247: done dumping result, returning 30583 1726853795.30255: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [02083763-bbaf-05ea-abc5-00000000288b] 30583 1726853795.30259: sending task result for task 02083763-bbaf-05ea-abc5-00000000288b 30583 1726853795.30357: done sending task result for task 02083763-bbaf-05ea-abc5-00000000288b 30583 1726853795.30360: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.016328", "end": "2024-09-20 13:36:35.258413", "rc": 1, "start": "2024-09-20 13:36:35.242085" } MSG: non-zero return code ...ignoring 30583 1726853795.30445: no more pending results, returning what we have 30583 1726853795.30449: results queue empty 30583 1726853795.30450: checking for any_errors_fatal 30583 1726853795.30456: done checking for any_errors_fatal 30583 1726853795.30457: checking for max_fail_percentage 30583 1726853795.30459: done checking for max_fail_percentage 30583 1726853795.30460: checking to see if all hosts have failed and the running result is not ok 30583 1726853795.30461: done checking to see if all hosts have failed 30583 1726853795.30461: getting the remaining hosts for this loop 30583 1726853795.30464: done getting the remaining hosts for this loop 30583 1726853795.30467: getting the next task for host managed_node2 30583 1726853795.30477: done getting next task for host managed_node2 30583 1726853795.30480: ^ task 
is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30583 1726853795.30485: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853795.30489: getting variables 30583 1726853795.30490: in VariableManager get_vars() 30583 1726853795.30531: Calling all_inventory to load vars for managed_node2 30583 1726853795.30533: Calling groups_inventory to load vars for managed_node2 30583 1726853795.30537: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853795.30546: Calling all_plugins_play to load vars for managed_node2 30583 1726853795.30549: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853795.30551: Calling groups_plugins_play to load vars for managed_node2 30583 1726853795.31520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853795.32356: done with get_vars() 30583 1726853795.32374: done getting variables 30583 1726853795.32416: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:36:35 -0400 (0:00:00.375) 0:02:10.661 ****** 30583 1726853795.32440: entering _queue_task() for managed_node2/set_fact 30583 1726853795.32659: worker is 1 (out of 1 available) 30583 1726853795.32676: exiting _queue_task() for managed_node2/set_fact 30583 1726853795.32689: done queuing things up, now waiting for results queue to drain 30583 1726853795.32690: waiting for pending results... 
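The "Get NM profile info" task traced above runs `nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc` and gets rc=1, which the play then ignores (`...ignoring`). The non-zero rc simply means no `statebr` profile file exists under `/etc`. A minimal, self-contained sketch of that check follows; the nmcli output here is simulated so the snippet runs anywhere, whereas the real task pipes live `nmcli` output through the same greps:

```shell
# Simulated `nmcli -f NAME,FILENAME connection show` output (hypothetical
# sample data); the real task uses the actual nmcli command.
nmcli_output() {
  printf 'NAME     FILENAME\n'
  printf 'eth0     /run/NetworkManager/system-connections/eth0.nmconnection\n'
}

# The task's check: grep exits 1 when nothing matches, so rc=1 means
# "no statebr profile under /etc" -- the same result seen in the log.
nmcli_output | grep statebr | grep /etc
echo "rc=$?"    # prints "rc=1"
```

Because `ignore_errors` is in effect, rc=1 is recorded in the registered result rather than failing the play, and the next task's conditional reads it.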
30583 1726853795.32881: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30583 1726853795.32961: in run() - task 02083763-bbaf-05ea-abc5-00000000288c 30583 1726853795.32976: variable 'ansible_search_path' from source: unknown 30583 1726853795.32979: variable 'ansible_search_path' from source: unknown 30583 1726853795.33007: calling self._execute() 30583 1726853795.33091: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853795.33094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853795.33105: variable 'omit' from source: magic vars 30583 1726853795.33389: variable 'ansible_distribution_major_version' from source: facts 30583 1726853795.33398: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853795.33496: variable 'nm_profile_exists' from source: set_fact 30583 1726853795.33506: Evaluated conditional (nm_profile_exists.rc == 0): False 30583 1726853795.33509: when evaluation is False, skipping this task 30583 1726853795.33512: _execute() done 30583 1726853795.33514: dumping result to json 30583 1726853795.33517: done dumping result, returning 30583 1726853795.33525: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-05ea-abc5-00000000288c] 30583 1726853795.33528: sending task result for task 02083763-bbaf-05ea-abc5-00000000288c skipping: [managed_node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 30583 1726853795.33656: no more pending results, returning what we have 30583 1726853795.33661: results queue empty 30583 1726853795.33662: checking for any_errors_fatal 30583 1726853795.33676: done checking for any_errors_fatal 30583 1726853795.33677: checking for max_fail_percentage 30583 1726853795.33679: done checking for 
max_fail_percentage 30583 1726853795.33681: checking to see if all hosts have failed and the running result is not ok 30583 1726853795.33682: done checking to see if all hosts have failed 30583 1726853795.33682: getting the remaining hosts for this loop 30583 1726853795.33685: done getting the remaining hosts for this loop 30583 1726853795.33688: getting the next task for host managed_node2 30583 1726853795.33698: done getting next task for host managed_node2 30583 1726853795.33701: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 30583 1726853795.33706: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853795.33709: getting variables 30583 1726853795.33711: in VariableManager get_vars() 30583 1726853795.33744: Calling all_inventory to load vars for managed_node2 30583 1726853795.33747: Calling groups_inventory to load vars for managed_node2 30583 1726853795.33750: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853795.33758: Calling all_plugins_play to load vars for managed_node2 30583 1726853795.33761: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853795.33763: Calling groups_plugins_play to load vars for managed_node2 30583 1726853795.34285: done sending task result for task 02083763-bbaf-05ea-abc5-00000000288c 30583 1726853795.34288: WORKER PROCESS EXITING 30583 1726853795.34536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853795.35393: done with get_vars() 30583 1726853795.35409: done getting variables 30583 1726853795.35450: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853795.35531: variable 'profile' from source: play vars 30583 1726853795.35534: variable 'interface' from source: play vars 30583 1726853795.35575: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 13:36:35 -0400 (0:00:00.031) 0:02:10.693 ****** 30583 1726853795.35598: entering _queue_task() for managed_node2/command 30583 1726853795.35809: worker is 1 (out of 1 available) 30583 1726853795.35823: exiting _queue_task() for managed_node2/command 30583 
1726853795.35836: done queuing things up, now waiting for results queue to drain 30583 1726853795.35837: waiting for pending results... 30583 1726853795.36016: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr 30583 1726853795.36101: in run() - task 02083763-bbaf-05ea-abc5-00000000288e 30583 1726853795.36111: variable 'ansible_search_path' from source: unknown 30583 1726853795.36116: variable 'ansible_search_path' from source: unknown 30583 1726853795.36142: calling self._execute() 30583 1726853795.36220: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853795.36224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853795.36232: variable 'omit' from source: magic vars 30583 1726853795.36498: variable 'ansible_distribution_major_version' from source: facts 30583 1726853795.36507: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853795.36592: variable 'profile_stat' from source: set_fact 30583 1726853795.36606: Evaluated conditional (profile_stat.stat.exists): False 30583 1726853795.36609: when evaluation is False, skipping this task 30583 1726853795.36612: _execute() done 30583 1726853795.36614: dumping result to json 30583 1726853795.36616: done dumping result, returning 30583 1726853795.36619: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr [02083763-bbaf-05ea-abc5-00000000288e] 30583 1726853795.36622: sending task result for task 02083763-bbaf-05ea-abc5-00000000288e 30583 1726853795.36700: done sending task result for task 02083763-bbaf-05ea-abc5-00000000288e 30583 1726853795.36703: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30583 1726853795.36755: no more pending results, returning what we have 30583 1726853795.36759: results queue empty 30583 
1726853795.36760: checking for any_errors_fatal 30583 1726853795.36766: done checking for any_errors_fatal 30583 1726853795.36767: checking for max_fail_percentage 30583 1726853795.36768: done checking for max_fail_percentage 30583 1726853795.36769: checking to see if all hosts have failed and the running result is not ok 30583 1726853795.36770: done checking to see if all hosts have failed 30583 1726853795.36772: getting the remaining hosts for this loop 30583 1726853795.36774: done getting the remaining hosts for this loop 30583 1726853795.36778: getting the next task for host managed_node2 30583 1726853795.36786: done getting next task for host managed_node2 30583 1726853795.36788: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 30583 1726853795.36793: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853795.36796: getting variables 30583 1726853795.36797: in VariableManager get_vars() 30583 1726853795.36830: Calling all_inventory to load vars for managed_node2 30583 1726853795.36833: Calling groups_inventory to load vars for managed_node2 30583 1726853795.36835: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853795.36844: Calling all_plugins_play to load vars for managed_node2 30583 1726853795.36846: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853795.36849: Calling groups_plugins_play to load vars for managed_node2 30583 1726853795.37720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853795.38563: done with get_vars() 30583 1726853795.38580: done getting variables 30583 1726853795.38622: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853795.38696: variable 'profile' from source: play vars 30583 1726853795.38699: variable 'interface' from source: play vars 30583 1726853795.38737: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:36:35 -0400 (0:00:00.031) 0:02:10.724 ****** 30583 1726853795.38759: entering _queue_task() for managed_node2/set_fact 30583 1726853795.38970: worker is 1 (out of 1 available) 30583 1726853795.38985: exiting _queue_task() for managed_node2/set_fact 30583 1726853795.39000: done queuing things up, now waiting for results queue to drain 30583 1726853795.39001: waiting for pending results... 
30583 1726853795.39180: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr 30583 1726853795.39273: in run() - task 02083763-bbaf-05ea-abc5-00000000288f 30583 1726853795.39283: variable 'ansible_search_path' from source: unknown 30583 1726853795.39287: variable 'ansible_search_path' from source: unknown 30583 1726853795.39314: calling self._execute() 30583 1726853795.39394: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853795.39398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853795.39408: variable 'omit' from source: magic vars 30583 1726853795.39674: variable 'ansible_distribution_major_version' from source: facts 30583 1726853795.39683: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853795.39770: variable 'profile_stat' from source: set_fact 30583 1726853795.39780: Evaluated conditional (profile_stat.stat.exists): False 30583 1726853795.39782: when evaluation is False, skipping this task 30583 1726853795.39785: _execute() done 30583 1726853795.39788: dumping result to json 30583 1726853795.39790: done dumping result, returning 30583 1726853795.39796: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [02083763-bbaf-05ea-abc5-00000000288f] 30583 1726853795.39802: sending task result for task 02083763-bbaf-05ea-abc5-00000000288f 30583 1726853795.39886: done sending task result for task 02083763-bbaf-05ea-abc5-00000000288f 30583 1726853795.39890: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30583 1726853795.39932: no more pending results, returning what we have 30583 1726853795.39936: results queue empty 30583 1726853795.39937: checking for any_errors_fatal 30583 1726853795.39943: done checking for any_errors_fatal 30583 1726853795.39943: 
checking for max_fail_percentage 30583 1726853795.39945: done checking for max_fail_percentage 30583 1726853795.39946: checking to see if all hosts have failed and the running result is not ok 30583 1726853795.39947: done checking to see if all hosts have failed 30583 1726853795.39947: getting the remaining hosts for this loop 30583 1726853795.39949: done getting the remaining hosts for this loop 30583 1726853795.39952: getting the next task for host managed_node2 30583 1726853795.39960: done getting next task for host managed_node2 30583 1726853795.39963: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 30583 1726853795.39968: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853795.39973: getting variables 30583 1726853795.39975: in VariableManager get_vars() 30583 1726853795.40011: Calling all_inventory to load vars for managed_node2 30583 1726853795.40014: Calling groups_inventory to load vars for managed_node2 30583 1726853795.40017: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853795.40026: Calling all_plugins_play to load vars for managed_node2 30583 1726853795.40028: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853795.40031: Calling groups_plugins_play to load vars for managed_node2 30583 1726853795.40787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853795.41746: done with get_vars() 30583 1726853795.41761: done getting variables 30583 1726853795.41803: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853795.41876: variable 'profile' from source: play vars 30583 1726853795.41879: variable 'interface' from source: play vars 30583 1726853795.41917: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:36:35 -0400 (0:00:00.031) 0:02:10.756 ****** 30583 1726853795.41939: entering _queue_task() for managed_node2/command 30583 1726853795.42151: worker is 1 (out of 1 available) 30583 1726853795.42164: exiting _queue_task() for managed_node2/command 30583 1726853795.42178: done queuing things up, now waiting for results queue to drain 30583 1726853795.42179: waiting for pending results... 
30583 1726853795.42351: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr 30583 1726853795.42437: in run() - task 02083763-bbaf-05ea-abc5-000000002890 30583 1726853795.42447: variable 'ansible_search_path' from source: unknown 30583 1726853795.42451: variable 'ansible_search_path' from source: unknown 30583 1726853795.42482: calling self._execute() 30583 1726853795.42555: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853795.42558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853795.42569: variable 'omit' from source: magic vars 30583 1726853795.42841: variable 'ansible_distribution_major_version' from source: facts 30583 1726853795.42854: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853795.42937: variable 'profile_stat' from source: set_fact 30583 1726853795.42945: Evaluated conditional (profile_stat.stat.exists): False 30583 1726853795.42954: when evaluation is False, skipping this task 30583 1726853795.42957: _execute() done 30583 1726853795.42959: dumping result to json 30583 1726853795.42962: done dumping result, returning 30583 1726853795.42969: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr [02083763-bbaf-05ea-abc5-000000002890] 30583 1726853795.42973: sending task result for task 02083763-bbaf-05ea-abc5-000000002890 30583 1726853795.43052: done sending task result for task 02083763-bbaf-05ea-abc5-000000002890 30583 1726853795.43057: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30583 1726853795.43109: no more pending results, returning what we have 30583 1726853795.43113: results queue empty 30583 1726853795.43114: checking for any_errors_fatal 30583 1726853795.43119: done checking for any_errors_fatal 30583 1726853795.43119: checking for 
max_fail_percentage 30583 1726853795.43121: done checking for max_fail_percentage 30583 1726853795.43122: checking to see if all hosts have failed and the running result is not ok 30583 1726853795.43123: done checking to see if all hosts have failed 30583 1726853795.43123: getting the remaining hosts for this loop 30583 1726853795.43125: done getting the remaining hosts for this loop 30583 1726853795.43129: getting the next task for host managed_node2 30583 1726853795.43137: done getting next task for host managed_node2 30583 1726853795.43139: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 30583 1726853795.43144: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853795.43148: getting variables 30583 1726853795.43149: in VariableManager get_vars() 30583 1726853795.43185: Calling all_inventory to load vars for managed_node2 30583 1726853795.43187: Calling groups_inventory to load vars for managed_node2 30583 1726853795.43190: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853795.43199: Calling all_plugins_play to load vars for managed_node2 30583 1726853795.43201: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853795.43204: Calling groups_plugins_play to load vars for managed_node2 30583 1726853795.43955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853795.44816: done with get_vars() 30583 1726853795.44831: done getting variables 30583 1726853795.44875: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853795.44946: variable 'profile' from source: play vars 30583 1726853795.44949: variable 'interface' from source: play vars 30583 1726853795.44990: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:36:35 -0400 (0:00:00.030) 0:02:10.787 ****** 30583 1726853795.45013: entering _queue_task() for managed_node2/set_fact 30583 1726853795.45243: worker is 1 (out of 1 available) 30583 1726853795.45257: exiting _queue_task() for managed_node2/set_fact 30583 1726853795.45269: done queuing things up, now waiting for results queue to drain 30583 1726853795.45272: waiting for pending results... 
30583 1726853795.45454: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr 30583 1726853795.45541: in run() - task 02083763-bbaf-05ea-abc5-000000002891 30583 1726853795.45552: variable 'ansible_search_path' from source: unknown 30583 1726853795.45556: variable 'ansible_search_path' from source: unknown 30583 1726853795.45590: calling self._execute() 30583 1726853795.45670: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853795.45676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853795.45685: variable 'omit' from source: magic vars 30583 1726853795.45961: variable 'ansible_distribution_major_version' from source: facts 30583 1726853795.45969: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853795.46052: variable 'profile_stat' from source: set_fact 30583 1726853795.46062: Evaluated conditional (profile_stat.stat.exists): False 30583 1726853795.46066: when evaluation is False, skipping this task 30583 1726853795.46068: _execute() done 30583 1726853795.46073: dumping result to json 30583 1726853795.46075: done dumping result, returning 30583 1726853795.46078: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr [02083763-bbaf-05ea-abc5-000000002891] 30583 1726853795.46083: sending task result for task 02083763-bbaf-05ea-abc5-000000002891 30583 1726853795.46168: done sending task result for task 02083763-bbaf-05ea-abc5-000000002891 30583 1726853795.46172: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30583 1726853795.46218: no more pending results, returning what we have 30583 1726853795.46222: results queue empty 30583 1726853795.46223: checking for any_errors_fatal 30583 1726853795.46228: done checking for any_errors_fatal 30583 1726853795.46229: checking 
for max_fail_percentage 30583 1726853795.46231: done checking for max_fail_percentage 30583 1726853795.46232: checking to see if all hosts have failed and the running result is not ok 30583 1726853795.46233: done checking to see if all hosts have failed 30583 1726853795.46233: getting the remaining hosts for this loop 30583 1726853795.46236: done getting the remaining hosts for this loop 30583 1726853795.46239: getting the next task for host managed_node2 30583 1726853795.46248: done getting next task for host managed_node2 30583 1726853795.46250: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 30583 1726853795.46254: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853795.46260: getting variables 30583 1726853795.46262: in VariableManager get_vars() 30583 1726853795.46303: Calling all_inventory to load vars for managed_node2 30583 1726853795.46305: Calling groups_inventory to load vars for managed_node2 30583 1726853795.46308: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853795.46317: Calling all_plugins_play to load vars for managed_node2 30583 1726853795.46320: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853795.46322: Calling groups_plugins_play to load vars for managed_node2 30583 1726853795.47234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853795.48074: done with get_vars() 30583 1726853795.48088: done getting variables 30583 1726853795.48129: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853795.48204: variable 'profile' from source: play vars 30583 1726853795.48208: variable 'interface' from source: play vars 30583 1726853795.48247: variable 'interface' from source: play vars TASK [Assert that the profile is absent - 'statebr'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 13:36:35 -0400 (0:00:00.032) 0:02:10.819 ****** 30583 1726853795.48269: entering _queue_task() for managed_node2/assert 30583 1726853795.48489: worker is 1 (out of 1 available) 30583 1726853795.48503: exiting _queue_task() for managed_node2/assert 30583 1726853795.48515: done queuing things up, now waiting for results queue to drain 30583 1726853795.48516: waiting for pending results... 
30583 1726853795.48707: running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'statebr' 30583 1726853795.48794: in run() - task 02083763-bbaf-05ea-abc5-000000002805 30583 1726853795.48805: variable 'ansible_search_path' from source: unknown 30583 1726853795.48808: variable 'ansible_search_path' from source: unknown 30583 1726853795.48837: calling self._execute() 30583 1726853795.48914: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853795.48920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853795.48928: variable 'omit' from source: magic vars 30583 1726853795.49206: variable 'ansible_distribution_major_version' from source: facts 30583 1726853795.49215: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853795.49221: variable 'omit' from source: magic vars 30583 1726853795.49254: variable 'omit' from source: magic vars 30583 1726853795.49328: variable 'profile' from source: play vars 30583 1726853795.49332: variable 'interface' from source: play vars 30583 1726853795.49380: variable 'interface' from source: play vars 30583 1726853795.49397: variable 'omit' from source: magic vars 30583 1726853795.49430: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853795.49457: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853795.49477: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853795.49491: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853795.49501: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853795.49527: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 30583 1726853795.49530: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853795.49532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853795.49606: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853795.49610: Set connection var ansible_timeout to 10 30583 1726853795.49612: Set connection var ansible_connection to ssh 30583 1726853795.49621: Set connection var ansible_shell_executable to /bin/sh 30583 1726853795.49623: Set connection var ansible_shell_type to sh 30583 1726853795.49629: Set connection var ansible_pipelining to False 30583 1726853795.49646: variable 'ansible_shell_executable' from source: unknown 30583 1726853795.49649: variable 'ansible_connection' from source: unknown 30583 1726853795.49651: variable 'ansible_module_compression' from source: unknown 30583 1726853795.49654: variable 'ansible_shell_type' from source: unknown 30583 1726853795.49656: variable 'ansible_shell_executable' from source: unknown 30583 1726853795.49658: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853795.49664: variable 'ansible_pipelining' from source: unknown 30583 1726853795.49667: variable 'ansible_timeout' from source: unknown 30583 1726853795.49669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853795.49770: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853795.49780: variable 'omit' from source: magic vars 30583 1726853795.49786: starting attempt loop 30583 1726853795.49789: running the handler 30583 1726853795.49878: variable 'lsr_net_profile_exists' from source: set_fact 30583 1726853795.49882: Evaluated conditional (not 
lsr_net_profile_exists): True 30583 1726853795.49888: handler run complete 30583 1726853795.49899: attempt loop complete, returning result 30583 1726853795.49902: _execute() done 30583 1726853795.49904: dumping result to json 30583 1726853795.49907: done dumping result, returning 30583 1726853795.49913: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'statebr' [02083763-bbaf-05ea-abc5-000000002805] 30583 1726853795.49917: sending task result for task 02083763-bbaf-05ea-abc5-000000002805 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30583 1726853795.50046: no more pending results, returning what we have 30583 1726853795.50049: results queue empty 30583 1726853795.50050: checking for any_errors_fatal 30583 1726853795.50055: done checking for any_errors_fatal 30583 1726853795.50056: checking for max_fail_percentage 30583 1726853795.50058: done checking for max_fail_percentage 30583 1726853795.50059: checking to see if all hosts have failed and the running result is not ok 30583 1726853795.50060: done checking to see if all hosts have failed 30583 1726853795.50060: getting the remaining hosts for this loop 30583 1726853795.50064: done getting the remaining hosts for this loop 30583 1726853795.50068: getting the next task for host managed_node2 30583 1726853795.50081: done getting next task for host managed_node2 30583 1726853795.50084: ^ task is: TASK: Get NetworkManager RPM version 30583 1726853795.50090: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853795.50094: getting variables 30583 1726853795.50096: in VariableManager get_vars() 30583 1726853795.50135: Calling all_inventory to load vars for managed_node2 30583 1726853795.50138: Calling groups_inventory to load vars for managed_node2 30583 1726853795.50141: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853795.50151: Calling all_plugins_play to load vars for managed_node2 30583 1726853795.50154: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853795.50156: Calling groups_plugins_play to load vars for managed_node2 30583 1726853795.50685: done sending task result for task 02083763-bbaf-05ea-abc5-000000002805 30583 1726853795.50688: WORKER PROCESS EXITING 30583 1726853795.50978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853795.51836: done with get_vars() 30583 1726853795.51852: done getting variables 30583 1726853795.51895: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NetworkManager RPM version] ****************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml:7 Friday 
20 September 2024 13:36:35 -0400 (0:00:00.036) 0:02:10.856 ****** 30583 1726853795.51922: entering _queue_task() for managed_node2/command 30583 1726853795.52160: worker is 1 (out of 1 available) 30583 1726853795.52176: exiting _queue_task() for managed_node2/command 30583 1726853795.52189: done queuing things up, now waiting for results queue to drain 30583 1726853795.52190: waiting for pending results... 30583 1726853795.52383: running TaskExecutor() for managed_node2/TASK: Get NetworkManager RPM version 30583 1726853795.52453: in run() - task 02083763-bbaf-05ea-abc5-000000002809 30583 1726853795.52468: variable 'ansible_search_path' from source: unknown 30583 1726853795.52473: variable 'ansible_search_path' from source: unknown 30583 1726853795.52502: calling self._execute() 30583 1726853795.52582: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853795.52586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853795.52594: variable 'omit' from source: magic vars 30583 1726853795.52877: variable 'ansible_distribution_major_version' from source: facts 30583 1726853795.52887: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853795.52893: variable 'omit' from source: magic vars 30583 1726853795.52923: variable 'omit' from source: magic vars 30583 1726853795.52947: variable 'omit' from source: magic vars 30583 1726853795.52985: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853795.53011: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853795.53028: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853795.53042: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853795.53052: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853795.53083: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853795.53087: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853795.53089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853795.53155: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853795.53162: Set connection var ansible_timeout to 10 30583 1726853795.53166: Set connection var ansible_connection to ssh 30583 1726853795.53173: Set connection var ansible_shell_executable to /bin/sh 30583 1726853795.53176: Set connection var ansible_shell_type to sh 30583 1726853795.53185: Set connection var ansible_pipelining to False 30583 1726853795.53204: variable 'ansible_shell_executable' from source: unknown 30583 1726853795.53206: variable 'ansible_connection' from source: unknown 30583 1726853795.53209: variable 'ansible_module_compression' from source: unknown 30583 1726853795.53211: variable 'ansible_shell_type' from source: unknown 30583 1726853795.53214: variable 'ansible_shell_executable' from source: unknown 30583 1726853795.53216: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853795.53218: variable 'ansible_pipelining' from source: unknown 30583 1726853795.53221: variable 'ansible_timeout' from source: unknown 30583 1726853795.53225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853795.53327: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853795.53336: variable 'omit' from source: magic vars 30583 1726853795.53341: starting 
attempt loop 30583 1726853795.53343: running the handler 30583 1726853795.53357: _low_level_execute_command(): starting 30583 1726853795.53366: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853795.53882: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853795.53887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853795.53892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853795.53940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853795.53943: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853795.53946: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853795.54027: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853795.55756: stdout chunk (state=3): >>>/root <<< 30583 1726853795.55857: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853795.55887: stderr chunk (state=3): >>><<< 30583 1726853795.55890: stdout chunk (state=3): >>><<< 30583 
1726853795.55908: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853795.55922: _low_level_execute_command(): starting 30583 1726853795.55929: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853795.5590973-36087-213473280587886 `" && echo ansible-tmp-1726853795.5590973-36087-213473280587886="` echo /root/.ansible/tmp/ansible-tmp-1726853795.5590973-36087-213473280587886 `" ) && sleep 0' 30583 1726853795.56352: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853795.56355: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853795.56357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853795.56366: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853795.56369: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853795.56412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853795.56415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853795.56497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853795.58598: stdout chunk (state=3): >>>ansible-tmp-1726853795.5590973-36087-213473280587886=/root/.ansible/tmp/ansible-tmp-1726853795.5590973-36087-213473280587886 <<< 30583 1726853795.58707: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853795.58732: stderr chunk (state=3): >>><<< 30583 1726853795.58736: stdout chunk (state=3): >>><<< 30583 1726853795.58749: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853795.5590973-36087-213473280587886=/root/.ansible/tmp/ansible-tmp-1726853795.5590973-36087-213473280587886 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853795.58778: variable 'ansible_module_compression' from source: unknown 30583 1726853795.58818: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30583 1726853795.58850: variable 'ansible_facts' from source: unknown 30583 1726853795.58908: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853795.5590973-36087-213473280587886/AnsiballZ_command.py 30583 1726853795.59006: Sending initial data 30583 1726853795.59009: Sent initial data (156 bytes) 30583 1726853795.59437: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853795.59441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 
debug2: match not found <<< 30583 1726853795.59443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853795.59445: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853795.59447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853795.59503: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853795.59507: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853795.59581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853795.61304: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853795.61375: 
stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853795.61447: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp151l5lwn /root/.ansible/tmp/ansible-tmp-1726853795.5590973-36087-213473280587886/AnsiballZ_command.py <<< 30583 1726853795.61450: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853795.5590973-36087-213473280587886/AnsiballZ_command.py" <<< 30583 1726853795.61516: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp151l5lwn" to remote "/root/.ansible/tmp/ansible-tmp-1726853795.5590973-36087-213473280587886/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853795.5590973-36087-213473280587886/AnsiballZ_command.py" <<< 30583 1726853795.62208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853795.62248: stderr chunk (state=3): >>><<< 30583 1726853795.62251: stdout chunk (state=3): >>><<< 30583 1726853795.62288: done transferring module to remote 30583 1726853795.62297: _low_level_execute_command(): starting 30583 1726853795.62302: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853795.5590973-36087-213473280587886/ /root/.ansible/tmp/ansible-tmp-1726853795.5590973-36087-213473280587886/AnsiballZ_command.py && sleep 0' 30583 1726853795.62726: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853795.62729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853795.62731: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853795.62733: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853795.62735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853795.62788: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853795.62791: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853795.62867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853795.64795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853795.64815: stderr chunk (state=3): >>><<< 30583 1726853795.64818: stdout chunk (state=3): >>><<< 30583 1726853795.64829: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853795.64832: _low_level_execute_command(): starting 30583 1726853795.64837: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853795.5590973-36087-213473280587886/AnsiballZ_command.py && sleep 0' 30583 1726853795.65249: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853795.65252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853795.65254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853795.65256: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853795.65261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 
1726853795.65306: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853795.65309: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853795.65396: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853795.97795: stdout chunk (state=3): >>> {"changed": true, "stdout": "NetworkManager-1.48.10-1.el10", "stderr": "", "rc": 0, "cmd": ["rpm", "-qa", "--qf", "%{name}-%{version}-%{release}\\n", "NetworkManager"], "start": "2024-09-20 13:36:35.811208", "end": "2024-09-20 13:36:35.976746", "delta": "0:00:00.165538", "msg": "", "invocation": {"module_args": {"_raw_params": "rpm -qa --qf '%{name}-%{version}-%{release}\\n' NetworkManager", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30583 1726853795.99444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853795.99478: stderr chunk (state=3): >>><<< 30583 1726853795.99481: stdout chunk (state=3): >>><<< 30583 1726853795.99497: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "NetworkManager-1.48.10-1.el10", "stderr": "", "rc": 0, "cmd": ["rpm", "-qa", "--qf", "%{name}-%{version}-%{release}\\n", "NetworkManager"], "start": "2024-09-20 13:36:35.811208", "end": "2024-09-20 13:36:35.976746", "delta": "0:00:00.165538", "msg": "", "invocation": {"module_args": {"_raw_params": "rpm -qa --qf '%{name}-%{version}-%{release}\\n' NetworkManager", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 
closed. 30583 1726853795.99527: done with _execute_module (ansible.legacy.command, {'_raw_params': "rpm -qa --qf '%{name}-%{version}-%{release}\\n' NetworkManager", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853795.5590973-36087-213473280587886/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853795.99535: _low_level_execute_command(): starting 30583 1726853795.99540: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853795.5590973-36087-213473280587886/ > /dev/null 2>&1 && sleep 0' 30583 1726853795.99999: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853796.00002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853796.00005: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853796.00014: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853796.00017: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853796.00019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853796.00061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853796.00064: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853796.00066: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853796.00144: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853796.02055: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853796.02083: stderr chunk (state=3): >>><<< 30583 1726853796.02086: stdout chunk (state=3): >>><<< 30583 1726853796.02098: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853796.02103: handler run complete 30583 1726853796.02122: Evaluated conditional (False): False 30583 1726853796.02131: attempt loop complete, returning result 30583 1726853796.02133: _execute() done 30583 1726853796.02136: dumping result to json 30583 1726853796.02141: done dumping result, returning 30583 1726853796.02149: done running TaskExecutor() for managed_node2/TASK: Get NetworkManager RPM version [02083763-bbaf-05ea-abc5-000000002809] 30583 1726853796.02153: sending task result for task 02083763-bbaf-05ea-abc5-000000002809 30583 1726853796.02254: done sending task result for task 02083763-bbaf-05ea-abc5-000000002809 30583 1726853796.02257: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "rpm", "-qa", "--qf", "%{name}-%{version}-%{release}\\n", "NetworkManager" ], "delta": "0:00:00.165538", "end": "2024-09-20 13:36:35.976746", "rc": 0, "start": "2024-09-20 13:36:35.811208" } STDOUT: NetworkManager-1.48.10-1.el10 30583 1726853796.02346: no more pending results, returning what we have 30583 1726853796.02350: results queue empty 30583 1726853796.02351: checking for any_errors_fatal 30583 1726853796.02358: done checking for any_errors_fatal 30583 1726853796.02358: checking for max_fail_percentage 30583 1726853796.02360: done checking for max_fail_percentage 30583 1726853796.02361: checking to see if all hosts have failed and the running result is not ok 30583 1726853796.02362: done checking to see if all hosts have failed 30583 1726853796.02362: getting the remaining hosts for this loop 30583 1726853796.02364: done getting the remaining hosts for this loop 30583 1726853796.02367: getting the next task for host managed_node2 30583 1726853796.02376: done getting next task for host 
managed_node2 30583 1726853796.02381: ^ task is: TASK: Store NetworkManager version 30583 1726853796.02385: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853796.02390: getting variables 30583 1726853796.02391: in VariableManager get_vars() 30583 1726853796.02434: Calling all_inventory to load vars for managed_node2 30583 1726853796.02436: Calling groups_inventory to load vars for managed_node2 30583 1726853796.02440: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853796.02450: Calling all_plugins_play to load vars for managed_node2 30583 1726853796.02452: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853796.02455: Calling groups_plugins_play to load vars for managed_node2 30583 1726853796.03415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853796.04262: done with get_vars() 30583 1726853796.04281: done getting variables 30583 1726853796.04325: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Store NetworkManager version] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml:14 Friday 20 September 2024 13:36:36 -0400 (0:00:00.524) 0:02:11.380 ****** 30583 1726853796.04347: entering _queue_task() for managed_node2/set_fact 30583 1726853796.04598: worker is 1 (out of 1 available) 30583 1726853796.04614: exiting _queue_task() for managed_node2/set_fact 30583 1726853796.04627: done queuing things up, now waiting for results queue to drain 30583 1726853796.04629: waiting for pending results... 30583 1726853796.04823: running TaskExecutor() for managed_node2/TASK: Store NetworkManager version 30583 1726853796.04915: in run() - task 02083763-bbaf-05ea-abc5-00000000280a 30583 1726853796.04926: variable 'ansible_search_path' from source: unknown 30583 1726853796.04931: variable 'ansible_search_path' from source: unknown 30583 1726853796.04960: calling self._execute() 30583 1726853796.05038: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853796.05042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853796.05050: variable 'omit' from source: magic vars 30583 1726853796.05339: variable 'ansible_distribution_major_version' from source: facts 30583 1726853796.05348: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853796.05354: variable 'omit' from source: magic vars 30583 1726853796.05389: variable 'omit' from source: magic vars 30583 1726853796.05468: variable '__rpm_q_networkmanager' from source: set_fact 30583 1726853796.05486: variable 'omit' from source: magic vars 30583 1726853796.05520: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853796.05546: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853796.05567: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853796.05582: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853796.05592: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853796.05618: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853796.05621: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853796.05624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853796.05695: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853796.05699: Set connection var ansible_timeout to 10 30583 1726853796.05702: Set connection var ansible_connection to ssh 30583 1726853796.05707: Set connection var ansible_shell_executable to /bin/sh 30583 1726853796.05710: Set connection var ansible_shell_type to sh 30583 1726853796.05717: Set connection var ansible_pipelining to False 30583 1726853796.05740: variable 'ansible_shell_executable' from source: unknown 30583 1726853796.05743: variable 'ansible_connection' from source: unknown 30583 1726853796.05746: variable 'ansible_module_compression' from source: unknown 30583 1726853796.05748: variable 'ansible_shell_type' from source: unknown 30583 1726853796.05750: variable 'ansible_shell_executable' from source: unknown 30583 1726853796.05752: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853796.05754: variable 'ansible_pipelining' from source: unknown 30583 1726853796.05756: variable 'ansible_timeout' from 
source: unknown 30583 1726853796.05758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853796.05857: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853796.05869: variable 'omit' from source: magic vars 30583 1726853796.05875: starting attempt loop 30583 1726853796.05878: running the handler 30583 1726853796.05887: handler run complete 30583 1726853796.05896: attempt loop complete, returning result 30583 1726853796.05898: _execute() done 30583 1726853796.05901: dumping result to json 30583 1726853796.05903: done dumping result, returning 30583 1726853796.05910: done running TaskExecutor() for managed_node2/TASK: Store NetworkManager version [02083763-bbaf-05ea-abc5-00000000280a] 30583 1726853796.05914: sending task result for task 02083763-bbaf-05ea-abc5-00000000280a 30583 1726853796.05997: done sending task result for task 02083763-bbaf-05ea-abc5-00000000280a 30583 1726853796.06000: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "networkmanager_nvr": "NetworkManager-1.48.10-1.el10" }, "changed": false } 30583 1726853796.06051: no more pending results, returning what we have 30583 1726853796.06054: results queue empty 30583 1726853796.06055: checking for any_errors_fatal 30583 1726853796.06067: done checking for any_errors_fatal 30583 1726853796.06068: checking for max_fail_percentage 30583 1726853796.06070: done checking for max_fail_percentage 30583 1726853796.06073: checking to see if all hosts have failed and the running result is not ok 30583 1726853796.06073: done checking to see if all hosts have failed 30583 1726853796.06074: getting the remaining hosts for this loop 30583 1726853796.06076: done getting the remaining hosts for this 
loop 30583 1726853796.06080: getting the next task for host managed_node2 30583 1726853796.06088: done getting next task for host managed_node2 30583 1726853796.06090: ^ task is: TASK: Show NetworkManager version 30583 1726853796.06094: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853796.06098: getting variables 30583 1726853796.06100: in VariableManager get_vars() 30583 1726853796.06142: Calling all_inventory to load vars for managed_node2 30583 1726853796.06145: Calling groups_inventory to load vars for managed_node2 30583 1726853796.06148: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853796.06157: Calling all_plugins_play to load vars for managed_node2 30583 1726853796.06160: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853796.06163: Calling groups_plugins_play to load vars for managed_node2 30583 1726853796.06965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853796.07827: done with get_vars() 30583 1726853796.07844: done getting variables 30583 1726853796.07885: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show NetworkManager version] ********************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml:18 Friday 20 September 2024 13:36:36 -0400 (0:00:00.035) 0:02:11.416 ****** 30583 1726853796.07909: entering _queue_task() for managed_node2/debug 30583 1726853796.08134: worker is 1 (out of 1 available) 30583 1726853796.08148: exiting _queue_task() for managed_node2/debug 30583 1726853796.08159: done queuing things up, now waiting for results queue to drain 30583 1726853796.08161: waiting for pending results... 
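As an aid to reading the trace above, the three tasks being executed from get_NetworkManager_NVR.yml ("Get NetworkManager RPM version", "Store NetworkManager version", "Show NetworkManager version") plausibly look like the sketch below. The rpm command line, the register variable `__rpm_q_networkmanager`, and the fact name `networkmanager_nvr` are all taken from the log itself; the exact YAML layout and the `changed_when` handling are assumptions, not the verbatim file.

```yaml
# Hypothetical reconstruction from the log output; not the verbatim task file.
- name: Get NetworkManager RPM version
  command: rpm -qa --qf "%{name}-%{version}-%{release}\n" NetworkManager
  register: __rpm_q_networkmanager
  changed_when: false   # query only; the log shows "changed": false

- name: Store NetworkManager version
  set_fact:
    networkmanager_nvr: "{{ __rpm_q_networkmanager.stdout }}"

- name: Show NetworkManager version
  debug:
    var: networkmanager_nvr
```

On this run the resulting fact is `NetworkManager-1.48.10-1.el10`, as shown in the `ok: [managed_node2]` results in the surrounding log entries.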
30583 1726853796.08352: running TaskExecutor() for managed_node2/TASK: Show NetworkManager version 30583 1726853796.08440: in run() - task 02083763-bbaf-05ea-abc5-00000000280b 30583 1726853796.08450: variable 'ansible_search_path' from source: unknown 30583 1726853796.08454: variable 'ansible_search_path' from source: unknown 30583 1726853796.08485: calling self._execute() 30583 1726853796.08560: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853796.08567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853796.08578: variable 'omit' from source: magic vars 30583 1726853796.08853: variable 'ansible_distribution_major_version' from source: facts 30583 1726853796.08864: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853796.08870: variable 'omit' from source: magic vars 30583 1726853796.08903: variable 'omit' from source: magic vars 30583 1726853796.08932: variable 'omit' from source: magic vars 30583 1726853796.08961: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853796.08990: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853796.09008: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853796.09021: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853796.09031: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853796.09058: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853796.09063: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853796.09066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 
1726853796.09135: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853796.09139: Set connection var ansible_timeout to 10 30583 1726853796.09144: Set connection var ansible_connection to ssh 30583 1726853796.09146: Set connection var ansible_shell_executable to /bin/sh 30583 1726853796.09153: Set connection var ansible_shell_type to sh 30583 1726853796.09159: Set connection var ansible_pipelining to False 30583 1726853796.09181: variable 'ansible_shell_executable' from source: unknown 30583 1726853796.09184: variable 'ansible_connection' from source: unknown 30583 1726853796.09187: variable 'ansible_module_compression' from source: unknown 30583 1726853796.09189: variable 'ansible_shell_type' from source: unknown 30583 1726853796.09191: variable 'ansible_shell_executable' from source: unknown 30583 1726853796.09193: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853796.09195: variable 'ansible_pipelining' from source: unknown 30583 1726853796.09197: variable 'ansible_timeout' from source: unknown 30583 1726853796.09202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853796.09303: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853796.09312: variable 'omit' from source: magic vars 30583 1726853796.09317: starting attempt loop 30583 1726853796.09320: running the handler 30583 1726853796.09358: variable 'networkmanager_nvr' from source: set_fact 30583 1726853796.09417: variable 'networkmanager_nvr' from source: set_fact 30583 1726853796.09426: handler run complete 30583 1726853796.09441: attempt loop complete, returning result 30583 1726853796.09445: _execute() done 30583 1726853796.09447: dumping result to json 30583 
1726853796.09450: done dumping result, returning 30583 1726853796.09454: done running TaskExecutor() for managed_node2/TASK: Show NetworkManager version [02083763-bbaf-05ea-abc5-00000000280b] 30583 1726853796.09458: sending task result for task 02083763-bbaf-05ea-abc5-00000000280b 30583 1726853796.09541: done sending task result for task 02083763-bbaf-05ea-abc5-00000000280b 30583 1726853796.09544: WORKER PROCESS EXITING ok: [managed_node2] => { "networkmanager_nvr": "NetworkManager-1.48.10-1.el10" } 30583 1726853796.09623: no more pending results, returning what we have 30583 1726853796.09626: results queue empty 30583 1726853796.09628: checking for any_errors_fatal 30583 1726853796.09632: done checking for any_errors_fatal 30583 1726853796.09633: checking for max_fail_percentage 30583 1726853796.09635: done checking for max_fail_percentage 30583 1726853796.09635: checking to see if all hosts have failed and the running result is not ok 30583 1726853796.09636: done checking to see if all hosts have failed 30583 1726853796.09637: getting the remaining hosts for this loop 30583 1726853796.09639: done getting the remaining hosts for this loop 30583 1726853796.09642: getting the next task for host managed_node2 30583 1726853796.09650: done getting next task for host managed_node2 30583 1726853796.09654: ^ task is: TASK: Conditional asserts 30583 1726853796.09657: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853796.09661: getting variables 30583 1726853796.09663: in VariableManager get_vars() 30583 1726853796.09702: Calling all_inventory to load vars for managed_node2 30583 1726853796.09704: Calling groups_inventory to load vars for managed_node2 30583 1726853796.09707: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853796.09715: Calling all_plugins_play to load vars for managed_node2 30583 1726853796.09718: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853796.09720: Calling groups_plugins_play to load vars for managed_node2 30583 1726853796.10607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853796.11453: done with get_vars() 30583 1726853796.11469: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 13:36:36 -0400 (0:00:00.036) 0:02:11.452 ****** 30583 1726853796.11537: entering _queue_task() for managed_node2/include_tasks 30583 1726853796.11776: worker is 1 (out of 1 available) 30583 1726853796.11790: exiting _queue_task() for managed_node2/include_tasks 30583 1726853796.11803: done queuing things up, now waiting for results queue to drain 30583 1726853796.11804: waiting for pending results... 
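The "Conditional asserts" task that the log is about to run (task path run_test.yml:42) iterates over `lsr_assert_when` and includes each item's task file only when its condition holds; the item keys `what` and `condition` and the per-item conditional `item['condition']` are visible in the trace below. A minimal sketch, assuming the loop is expressed this way:

```yaml
# Hypothetical sketch; structure inferred from the item keys ('what',
# 'condition') and the evaluated conditionals visible in the log.
- name: Conditional asserts
  include_tasks: "{{ item.what }}"
  when: item.condition
  loop: "{{ lsr_assert_when }}"
```

In this run the single item is `{'what': 'tasks/assert_device_absent.yml', 'condition': True}`, so assert_device_absent.yml is included for managed_node2.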
30583 1726853796.12028: running TaskExecutor() for managed_node2/TASK: Conditional asserts 30583 1726853796.12097: in run() - task 02083763-bbaf-05ea-abc5-0000000020b3 30583 1726853796.12101: variable 'ansible_search_path' from source: unknown 30583 1726853796.12159: variable 'ansible_search_path' from source: unknown 30583 1726853796.12339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30583 1726853796.13829: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30583 1726853796.13879: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30583 1726853796.13910: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30583 1726853796.13936: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30583 1726853796.13956: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30583 1726853796.14027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30583 1726853796.14047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30583 1726853796.14066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30583 1726853796.14093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30583 1726853796.14104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30583 1726853796.14189: variable 'lsr_assert_when' from source: include params 30583 1726853796.14286: variable 'network_provider' from source: set_fact 30583 1726853796.14342: variable 'omit' from source: magic vars 30583 1726853796.14423: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853796.14430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853796.14440: variable 'omit' from source: magic vars 30583 1726853796.14579: variable 'ansible_distribution_major_version' from source: facts 30583 1726853796.14587: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853796.14662: variable 'item' from source: unknown 30583 1726853796.14666: Evaluated conditional (item['condition']): True 30583 1726853796.14721: variable 'item' from source: unknown 30583 1726853796.14743: variable 'item' from source: unknown 30583 1726853796.14792: variable 'item' from source: unknown 30583 1726853796.14930: dumping result to json 30583 1726853796.14933: done dumping result, returning 30583 1726853796.14935: done running TaskExecutor() for managed_node2/TASK: Conditional asserts [02083763-bbaf-05ea-abc5-0000000020b3] 30583 1726853796.14936: sending task result for task 02083763-bbaf-05ea-abc5-0000000020b3 30583 1726853796.14998: no more pending results, returning what we have 30583 1726853796.15003: in VariableManager get_vars() 30583 1726853796.15052: Calling all_inventory to load vars for managed_node2 30583 1726853796.15054: Calling groups_inventory to load vars for managed_node2 30583 1726853796.15060: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853796.15070: Calling all_plugins_play to load vars for managed_node2 30583 
1726853796.15075: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853796.15078: Calling groups_plugins_play to load vars for managed_node2 30583 1726853796.15092: done sending task result for task 02083763-bbaf-05ea-abc5-0000000020b3 30583 1726853796.15094: WORKER PROCESS EXITING 30583 1726853796.15912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853796.16877: done with get_vars() 30583 1726853796.16891: variable 'ansible_search_path' from source: unknown 30583 1726853796.16891: variable 'ansible_search_path' from source: unknown 30583 1726853796.16918: we have included files to process 30583 1726853796.16919: generating all_blocks data 30583 1726853796.16920: done generating all_blocks data 30583 1726853796.16924: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30583 1726853796.16925: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30583 1726853796.16927: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30583 1726853796.17001: in VariableManager get_vars() 30583 1726853796.17016: done with get_vars() 30583 1726853796.17094: done processing included file 30583 1726853796.17095: iterating over new_blocks loaded from include file 30583 1726853796.17096: in VariableManager get_vars() 30583 1726853796.17106: done with get_vars() 30583 1726853796.17107: filtering new block on tags 30583 1726853796.17128: done filtering new block on tags 30583 1726853796.17130: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 => (item={'what': 
'tasks/assert_device_absent.yml', 'condition': True}) 30583 1726853796.17134: extending task lists for all hosts with included blocks 30583 1726853796.17912: done extending task lists 30583 1726853796.17913: done processing included files 30583 1726853796.17914: results queue empty 30583 1726853796.17914: checking for any_errors_fatal 30583 1726853796.17917: done checking for any_errors_fatal 30583 1726853796.17917: checking for max_fail_percentage 30583 1726853796.17918: done checking for max_fail_percentage 30583 1726853796.17919: checking to see if all hosts have failed and the running result is not ok 30583 1726853796.17919: done checking to see if all hosts have failed 30583 1726853796.17920: getting the remaining hosts for this loop 30583 1726853796.17921: done getting the remaining hosts for this loop 30583 1726853796.17922: getting the next task for host managed_node2 30583 1726853796.17925: done getting next task for host managed_node2 30583 1726853796.17927: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30583 1726853796.17929: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853796.17935: getting variables 30583 1726853796.17936: in VariableManager get_vars() 30583 1726853796.17944: Calling all_inventory to load vars for managed_node2 30583 1726853796.17946: Calling groups_inventory to load vars for managed_node2 30583 1726853796.17947: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853796.17951: Calling all_plugins_play to load vars for managed_node2 30583 1726853796.17953: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853796.17954: Calling groups_plugins_play to load vars for managed_node2 30583 1726853796.18637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853796.19486: done with get_vars() 30583 1726853796.19504: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 13:36:36 -0400 (0:00:00.080) 0:02:11.532 ****** 30583 1726853796.19562: entering _queue_task() for managed_node2/include_tasks 30583 1726853796.19836: worker is 1 (out of 1 available) 30583 1726853796.19854: exiting _queue_task() for managed_node2/include_tasks 30583 1726853796.19870: done queuing things up, now waiting for results queue to drain 30583 1726853796.19873: waiting for pending results... 
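The task the log reaches here (assert_device_absent.yml:3) is a plain include of another task file; only the task name and the included filename come from the log, and the sketch below assumes nothing beyond that:

```yaml
# Hypothetical sketch of assert_device_absent.yml:3.
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml
```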
30583 1726853796.20064: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 30583 1726853796.20144: in run() - task 02083763-bbaf-05ea-abc5-0000000028d3 30583 1726853796.20154: variable 'ansible_search_path' from source: unknown 30583 1726853796.20160: variable 'ansible_search_path' from source: unknown 30583 1726853796.20188: calling self._execute() 30583 1726853796.20273: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853796.20277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853796.20286: variable 'omit' from source: magic vars 30583 1726853796.20564: variable 'ansible_distribution_major_version' from source: facts 30583 1726853796.20572: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853796.20578: _execute() done 30583 1726853796.20581: dumping result to json 30583 1726853796.20583: done dumping result, returning 30583 1726853796.20590: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-05ea-abc5-0000000028d3] 30583 1726853796.20594: sending task result for task 02083763-bbaf-05ea-abc5-0000000028d3 30583 1726853796.20682: done sending task result for task 02083763-bbaf-05ea-abc5-0000000028d3 30583 1726853796.20685: WORKER PROCESS EXITING 30583 1726853796.20713: no more pending results, returning what we have 30583 1726853796.20718: in VariableManager get_vars() 30583 1726853796.20774: Calling all_inventory to load vars for managed_node2 30583 1726853796.20777: Calling groups_inventory to load vars for managed_node2 30583 1726853796.20780: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853796.20795: Calling all_plugins_play to load vars for managed_node2 30583 1726853796.20798: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853796.20801: Calling groups_plugins_play to load vars for managed_node2 30583 
1726853796.21694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853796.22546: done with get_vars() 30583 1726853796.22562: variable 'ansible_search_path' from source: unknown 30583 1726853796.22563: variable 'ansible_search_path' from source: unknown 30583 1726853796.22657: variable 'item' from source: include params 30583 1726853796.22684: we have included files to process 30583 1726853796.22685: generating all_blocks data 30583 1726853796.22686: done generating all_blocks data 30583 1726853796.22687: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30583 1726853796.22688: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30583 1726853796.22689: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30583 1726853796.22811: done processing included file 30583 1726853796.22812: iterating over new_blocks loaded from include file 30583 1726853796.22813: in VariableManager get_vars() 30583 1726853796.22826: done with get_vars() 30583 1726853796.22827: filtering new block on tags 30583 1726853796.22842: done filtering new block on tags 30583 1726853796.22844: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 30583 1726853796.22847: extending task lists for all hosts with included blocks 30583 1726853796.22939: done extending task lists 30583 1726853796.22940: done processing included files 30583 1726853796.22940: results queue empty 30583 1726853796.22941: checking for any_errors_fatal 30583 1726853796.22944: done checking for any_errors_fatal 30583 1726853796.22944: checking for 
max_fail_percentage 30583 1726853796.22945: done checking for max_fail_percentage 30583 1726853796.22945: checking to see if all hosts have failed and the running result is not ok 30583 1726853796.22946: done checking to see if all hosts have failed 30583 1726853796.22946: getting the remaining hosts for this loop 30583 1726853796.22947: done getting the remaining hosts for this loop 30583 1726853796.22949: getting the next task for host managed_node2 30583 1726853796.22952: done getting next task for host managed_node2 30583 1726853796.22953: ^ task is: TASK: Get stat for interface {{ interface }} 30583 1726853796.22955: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853796.22957: getting variables 30583 1726853796.22959: in VariableManager get_vars() 30583 1726853796.22967: Calling all_inventory to load vars for managed_node2 30583 1726853796.22968: Calling groups_inventory to load vars for managed_node2 30583 1726853796.22970: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853796.22975: Calling all_plugins_play to load vars for managed_node2 30583 1726853796.22976: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853796.22978: Calling groups_plugins_play to load vars for managed_node2 30583 1726853796.23610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853796.24438: done with get_vars() 30583 1726853796.24455: done getting variables 30583 1726853796.24537: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:36:36 -0400 (0:00:00.049) 0:02:11.582 ****** 30583 1726853796.24562: entering _queue_task() for managed_node2/stat 30583 1726853796.24813: worker is 1 (out of 1 available) 30583 1726853796.24828: exiting _queue_task() for managed_node2/stat 30583 1726853796.24841: done queuing things up, now waiting for results queue to drain 30583 1726853796.24843: waiting for pending results... 
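The next task, "Get stat for interface {{ interface }}" (templated to "Get stat for interface statebr", since `interface` comes from play vars), runs the stat module on the managed node. The module name and the templated task title come from the log; the path under /sys/class/net and the register name `interface_stat` are assumptions for illustration only:

```yaml
# Hypothetical sketch: stat module and task name are from the log; the
# path and the register name interface_stat are assumed, not verbatim.
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"   # assumed location of the device node
  register: interface_stat
```

A device-absent assertion would then typically check that the registered stat result reports the path as missing.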
30583 1726853796.25029: running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr 30583 1726853796.25119: in run() - task 02083763-bbaf-05ea-abc5-000000002979 30583 1726853796.25130: variable 'ansible_search_path' from source: unknown 30583 1726853796.25134: variable 'ansible_search_path' from source: unknown 30583 1726853796.25164: calling self._execute() 30583 1726853796.25238: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853796.25241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853796.25250: variable 'omit' from source: magic vars 30583 1726853796.25516: variable 'ansible_distribution_major_version' from source: facts 30583 1726853796.25526: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853796.25532: variable 'omit' from source: magic vars 30583 1726853796.25568: variable 'omit' from source: magic vars 30583 1726853796.25635: variable 'interface' from source: play vars 30583 1726853796.25649: variable 'omit' from source: magic vars 30583 1726853796.25683: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853796.25709: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853796.25729: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853796.25742: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853796.25751: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853796.25778: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853796.25781: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853796.25783: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853796.25854: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853796.25862: Set connection var ansible_timeout to 10 30583 1726853796.25864: Set connection var ansible_connection to ssh 30583 1726853796.25867: Set connection var ansible_shell_executable to /bin/sh 30583 1726853796.25869: Set connection var ansible_shell_type to sh 30583 1726853796.25879: Set connection var ansible_pipelining to False 30583 1726853796.25896: variable 'ansible_shell_executable' from source: unknown 30583 1726853796.25899: variable 'ansible_connection' from source: unknown 30583 1726853796.25902: variable 'ansible_module_compression' from source: unknown 30583 1726853796.25904: variable 'ansible_shell_type' from source: unknown 30583 1726853796.25907: variable 'ansible_shell_executable' from source: unknown 30583 1726853796.25909: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853796.25911: variable 'ansible_pipelining' from source: unknown 30583 1726853796.25913: variable 'ansible_timeout' from source: unknown 30583 1726853796.25917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853796.26062: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30583 1726853796.26069: variable 'omit' from source: magic vars 30583 1726853796.26076: starting attempt loop 30583 1726853796.26079: running the handler 30583 1726853796.26091: _low_level_execute_command(): starting 30583 1726853796.26097: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853796.26611: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853796.26615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853796.26619: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853796.26622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853796.26675: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853796.26679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853796.26681: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853796.26763: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853796.28524: stdout chunk (state=3): >>>/root <<< 30583 1726853796.28622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853796.28651: stderr chunk (state=3): >>><<< 30583 1726853796.28656: stdout chunk (state=3): >>><<< 30583 1726853796.28680: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853796.28692: _low_level_execute_command(): starting 30583 1726853796.28698: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853796.2868013-36102-62706881524409 `" && echo ansible-tmp-1726853796.2868013-36102-62706881524409="` echo /root/.ansible/tmp/ansible-tmp-1726853796.2868013-36102-62706881524409 `" ) && sleep 0' 30583 1726853796.29119: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853796.29122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853796.29132: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853796.29134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853796.29182: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853796.29188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853796.29263: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853796.31301: stdout chunk (state=3): >>>ansible-tmp-1726853796.2868013-36102-62706881524409=/root/.ansible/tmp/ansible-tmp-1726853796.2868013-36102-62706881524409 <<< 30583 1726853796.31406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853796.31431: stderr chunk (state=3): >>><<< 30583 1726853796.31434: stdout chunk (state=3): >>><<< 30583 1726853796.31447: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853796.2868013-36102-62706881524409=/root/.ansible/tmp/ansible-tmp-1726853796.2868013-36102-62706881524409 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853796.31488: variable 'ansible_module_compression' from source: unknown 30583 1726853796.31531: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30583 1726853796.31561: variable 'ansible_facts' from source: unknown 30583 1726853796.31623: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853796.2868013-36102-62706881524409/AnsiballZ_stat.py 30583 1726853796.31718: Sending initial data 30583 1726853796.31722: Sent initial data (152 bytes) 30583 1726853796.32144: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853796.32149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853796.32152: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853796.32154: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853796.32157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853796.32202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853796.32205: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853796.32284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853796.34000: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30583 1726853796.34003: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853796.34068: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30583 1726853796.34135: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp2k2u518u /root/.ansible/tmp/ansible-tmp-1726853796.2868013-36102-62706881524409/AnsiballZ_stat.py <<< 30583 1726853796.34138: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853796.2868013-36102-62706881524409/AnsiballZ_stat.py" <<< 30583 1726853796.34203: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp2k2u518u" to remote "/root/.ansible/tmp/ansible-tmp-1726853796.2868013-36102-62706881524409/AnsiballZ_stat.py" <<< 30583 1726853796.34206: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853796.2868013-36102-62706881524409/AnsiballZ_stat.py" <<< 30583 1726853796.34851: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853796.34891: stderr chunk (state=3): >>><<< 30583 1726853796.34895: stdout chunk (state=3): >>><<< 30583 1726853796.34932: done transferring module to remote 30583 1726853796.34940: _low_level_execute_command(): starting 30583 1726853796.34944: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853796.2868013-36102-62706881524409/ /root/.ansible/tmp/ansible-tmp-1726853796.2868013-36102-62706881524409/AnsiballZ_stat.py && sleep 0' 30583 1726853796.35370: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853796.35375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853796.35377: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853796.35380: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853796.35386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853796.35426: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853796.35430: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853796.35508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853796.37412: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853796.37438: stderr chunk (state=3): >>><<< 30583 1726853796.37441: stdout chunk (state=3): >>><<< 30583 1726853796.37453: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853796.37456: _low_level_execute_command(): starting 30583 1726853796.37462: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853796.2868013-36102-62706881524409/AnsiballZ_stat.py && sleep 0' 30583 1726853796.37869: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853796.37874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853796.37876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853796.37879: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 <<< 30583 1726853796.37881: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853796.37927: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853796.37930: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853796.38013: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853796.53752: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30583 1726853796.55160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<< 30583 1726853796.55187: stderr chunk (state=3): >>><<< 30583 1726853796.55191: stdout chunk (state=3): >>><<< 30583 1726853796.55210: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853796.55234: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853796.2868013-36102-62706881524409/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853796.55243: _low_level_execute_command(): starting 30583 1726853796.55247: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853796.2868013-36102-62706881524409/ > /dev/null 2>&1 && sleep 0' 30583 1726853796.55709: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853796.55712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853796.55714: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853796.55716: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853796.55718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853796.55766: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853796.55769: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853796.55786: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853796.55860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853796.57816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853796.57844: stderr chunk (state=3): >>><<< 30583 1726853796.57847: stdout chunk (state=3): >>><<< 30583 1726853796.57860: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853796.57869: handler run complete 30583 1726853796.57890: attempt loop complete, returning result 30583 1726853796.57894: _execute() done 30583 1726853796.57896: dumping result to json 30583 1726853796.57898: done dumping result, returning 30583 1726853796.57907: done running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr [02083763-bbaf-05ea-abc5-000000002979] 30583 1726853796.57911: sending task result for task 02083763-bbaf-05ea-abc5-000000002979 30583 1726853796.58004: done sending task result for task 02083763-bbaf-05ea-abc5-000000002979 30583 1726853796.58007: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 30583 1726853796.58063: no more pending results, returning what we have 30583 1726853796.58066: results queue empty 30583 1726853796.58067: checking for any_errors_fatal 30583 1726853796.58069: done checking for any_errors_fatal 30583 1726853796.58070: checking for max_fail_percentage 30583 1726853796.58074: done checking for max_fail_percentage 30583 1726853796.58075: checking to see if all hosts have failed and the running result is not ok 30583 1726853796.58075: done checking to see if all hosts have failed 30583 1726853796.58076: getting the remaining hosts for this loop 30583 1726853796.58078: done getting the remaining hosts for this loop 30583 1726853796.58081: getting the next task for host managed_node2 30583 
1726853796.58091: done getting next task for host managed_node2 30583 1726853796.58094: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 30583 1726853796.58098: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853796.58104: getting variables 30583 1726853796.58106: in VariableManager get_vars() 30583 1726853796.58151: Calling all_inventory to load vars for managed_node2 30583 1726853796.58154: Calling groups_inventory to load vars for managed_node2 30583 1726853796.58157: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853796.58169: Calling all_plugins_play to load vars for managed_node2 30583 1726853796.58178: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853796.58182: Calling groups_plugins_play to load vars for managed_node2 30583 1726853796.63539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853796.64383: done with get_vars() 30583 1726853796.64403: done getting variables 30583 1726853796.64439: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853796.64511: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 13:36:36 -0400 (0:00:00.399) 0:02:11.982 ****** 30583 1726853796.64532: entering _queue_task() for managed_node2/assert 30583 1726853796.64818: worker is 1 (out of 1 available) 30583 1726853796.64833: exiting _queue_task() for managed_node2/assert 30583 1726853796.64846: done queuing things up, now waiting for results queue to drain 30583 1726853796.64848: waiting for pending results... 
30583 1726853796.65046: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' 30583 1726853796.65135: in run() - task 02083763-bbaf-05ea-abc5-0000000028d4 30583 1726853796.65145: variable 'ansible_search_path' from source: unknown 30583 1726853796.65152: variable 'ansible_search_path' from source: unknown 30583 1726853796.65189: calling self._execute() 30583 1726853796.65263: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853796.65270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853796.65281: variable 'omit' from source: magic vars 30583 1726853796.65568: variable 'ansible_distribution_major_version' from source: facts 30583 1726853796.65580: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853796.65586: variable 'omit' from source: magic vars 30583 1726853796.65622: variable 'omit' from source: magic vars 30583 1726853796.65694: variable 'interface' from source: play vars 30583 1726853796.65709: variable 'omit' from source: magic vars 30583 1726853796.65745: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853796.65776: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853796.65793: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853796.65806: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853796.65817: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853796.65843: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853796.65847: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853796.65850: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853796.65921: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853796.65925: Set connection var ansible_timeout to 10 30583 1726853796.65928: Set connection var ansible_connection to ssh 30583 1726853796.65934: Set connection var ansible_shell_executable to /bin/sh 30583 1726853796.65936: Set connection var ansible_shell_type to sh 30583 1726853796.65945: Set connection var ansible_pipelining to False 30583 1726853796.65965: variable 'ansible_shell_executable' from source: unknown 30583 1726853796.65969: variable 'ansible_connection' from source: unknown 30583 1726853796.65974: variable 'ansible_module_compression' from source: unknown 30583 1726853796.65977: variable 'ansible_shell_type' from source: unknown 30583 1726853796.65979: variable 'ansible_shell_executable' from source: unknown 30583 1726853796.65982: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853796.65984: variable 'ansible_pipelining' from source: unknown 30583 1726853796.65987: variable 'ansible_timeout' from source: unknown 30583 1726853796.65989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853796.66096: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853796.66107: variable 'omit' from source: magic vars 30583 1726853796.66112: starting attempt loop 30583 1726853796.66115: running the handler 30583 1726853796.66219: variable 'interface_stat' from source: set_fact 30583 1726853796.66226: Evaluated conditional (not interface_stat.stat.exists): True 30583 1726853796.66231: handler run complete 30583 1726853796.66242: attempt loop complete, returning result 
30583 1726853796.66245: _execute() done 30583 1726853796.66247: dumping result to json 30583 1726853796.66250: done dumping result, returning 30583 1726853796.66257: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' [02083763-bbaf-05ea-abc5-0000000028d4] 30583 1726853796.66262: sending task result for task 02083763-bbaf-05ea-abc5-0000000028d4 30583 1726853796.66344: done sending task result for task 02083763-bbaf-05ea-abc5-0000000028d4 30583 1726853796.66347: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30583 1726853796.66418: no more pending results, returning what we have 30583 1726853796.66422: results queue empty 30583 1726853796.66423: checking for any_errors_fatal 30583 1726853796.66435: done checking for any_errors_fatal 30583 1726853796.66436: checking for max_fail_percentage 30583 1726853796.66438: done checking for max_fail_percentage 30583 1726853796.66439: checking to see if all hosts have failed and the running result is not ok 30583 1726853796.66439: done checking to see if all hosts have failed 30583 1726853796.66440: getting the remaining hosts for this loop 30583 1726853796.66442: done getting the remaining hosts for this loop 30583 1726853796.66446: getting the next task for host managed_node2 30583 1726853796.66454: done getting next task for host managed_node2 30583 1726853796.66460: ^ task is: TASK: Success in test '{{ lsr_description }}' 30583 1726853796.66462: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30583 1726853796.66467: getting variables 30583 1726853796.66468: in VariableManager get_vars() 30583 1726853796.66511: Calling all_inventory to load vars for managed_node2 30583 1726853796.66514: Calling groups_inventory to load vars for managed_node2 30583 1726853796.66517: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853796.66526: Calling all_plugins_play to load vars for managed_node2 30583 1726853796.66529: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853796.66531: Calling groups_plugins_play to load vars for managed_node2 30583 1726853796.67347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853796.68323: done with get_vars() 30583 1726853796.68338: done getting variables 30583 1726853796.68382: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30583 1726853796.68463: variable 'lsr_description' from source: include params TASK [Success in test 'I will not get an error when I try to remove an absent profile'] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 13:36:36 -0400 (0:00:00.039) 0:02:12.022 ****** 30583 1726853796.68488: entering _queue_task() for managed_node2/debug 30583 1726853796.68730: worker is 1 (out of 1 available) 30583 1726853796.68746: exiting _queue_task() for managed_node2/debug 30583 1726853796.68763: done queuing things up, now waiting for results queue to drain 30583 1726853796.68764: waiting for pending results... 
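The assert task that just completed above (task path `tasks/assert_device_absent.yml:5`) evaluated the conditional `not interface_stat.stat.exists` against the `stat` result captured earlier, and returned "All assertions passed". A minimal reconstruction of what that task likely looks like, assuming the standard `assert` module as logged (the actual file contents are not part of this trace, so the exact wording is an assumption):

```yaml
# Hedged sketch of tasks/assert_device_absent.yml:5 based on the
# evaluated conditional in the trace; the real file may differ.
- name: "Assert that the interface is absent - '{{ interface }}'"
  assert:
    that:
      - not interface_stat.stat.exists
```

With `changed: false` and the default success message, this produces exactly the `ok: [managed_node2]` / "All assertions passed" result seen in the log.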
30583 1726853796.68939: running TaskExecutor() for managed_node2/TASK: Success in test 'I will not get an error when I try to remove an absent profile' 30583 1726853796.69018: in run() - task 02083763-bbaf-05ea-abc5-0000000020b4 30583 1726853796.69031: variable 'ansible_search_path' from source: unknown 30583 1726853796.69036: variable 'ansible_search_path' from source: unknown 30583 1726853796.69062: calling self._execute() 30583 1726853796.69151: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853796.69161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853796.69168: variable 'omit' from source: magic vars 30583 1726853796.69444: variable 'ansible_distribution_major_version' from source: facts 30583 1726853796.69455: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853796.69462: variable 'omit' from source: magic vars 30583 1726853796.69491: variable 'omit' from source: magic vars 30583 1726853796.69565: variable 'lsr_description' from source: include params 30583 1726853796.69580: variable 'omit' from source: magic vars 30583 1726853796.69613: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853796.69645: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853796.69659: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853796.69672: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853796.69682: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853796.69706: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853796.69709: variable 'ansible_host' from source: host vars for 
'managed_node2' 30583 1726853796.69712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853796.69783: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853796.69787: Set connection var ansible_timeout to 10 30583 1726853796.69790: Set connection var ansible_connection to ssh 30583 1726853796.69795: Set connection var ansible_shell_executable to /bin/sh 30583 1726853796.69797: Set connection var ansible_shell_type to sh 30583 1726853796.69805: Set connection var ansible_pipelining to False 30583 1726853796.69823: variable 'ansible_shell_executable' from source: unknown 30583 1726853796.69826: variable 'ansible_connection' from source: unknown 30583 1726853796.69829: variable 'ansible_module_compression' from source: unknown 30583 1726853796.69831: variable 'ansible_shell_type' from source: unknown 30583 1726853796.69833: variable 'ansible_shell_executable' from source: unknown 30583 1726853796.69836: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853796.69839: variable 'ansible_pipelining' from source: unknown 30583 1726853796.69842: variable 'ansible_timeout' from source: unknown 30583 1726853796.69844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853796.69943: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853796.69952: variable 'omit' from source: magic vars 30583 1726853796.69957: starting attempt loop 30583 1726853796.69964: running the handler 30583 1726853796.70001: handler run complete 30583 1726853796.70011: attempt loop complete, returning result 30583 1726853796.70014: _execute() done 30583 1726853796.70017: dumping result to json 30583 1726853796.70019: done 
dumping result, returning 30583 1726853796.70026: done running TaskExecutor() for managed_node2/TASK: Success in test 'I will not get an error when I try to remove an absent profile' [02083763-bbaf-05ea-abc5-0000000020b4] 30583 1726853796.70029: sending task result for task 02083763-bbaf-05ea-abc5-0000000020b4 30583 1726853796.70110: done sending task result for task 02083763-bbaf-05ea-abc5-0000000020b4 30583 1726853796.70113: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: +++++ Success in test 'I will not get an error when I try to remove an absent profile' +++++ 30583 1726853796.70157: no more pending results, returning what we have 30583 1726853796.70163: results queue empty 30583 1726853796.70164: checking for any_errors_fatal 30583 1726853796.70174: done checking for any_errors_fatal 30583 1726853796.70175: checking for max_fail_percentage 30583 1726853796.70177: done checking for max_fail_percentage 30583 1726853796.70178: checking to see if all hosts have failed and the running result is not ok 30583 1726853796.70178: done checking to see if all hosts have failed 30583 1726853796.70179: getting the remaining hosts for this loop 30583 1726853796.70181: done getting the remaining hosts for this loop 30583 1726853796.70185: getting the next task for host managed_node2 30583 1726853796.70193: done getting next task for host managed_node2 30583 1726853796.70197: ^ task is: TASK: Cleanup 30583 1726853796.70199: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 30583 1726853796.70204: getting variables 30583 1726853796.70206: in VariableManager get_vars() 30583 1726853796.70248: Calling all_inventory to load vars for managed_node2 30583 1726853796.70251: Calling groups_inventory to load vars for managed_node2 30583 1726853796.70254: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853796.70266: Calling all_plugins_play to load vars for managed_node2 30583 1726853796.70269: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853796.70277: Calling groups_plugins_play to load vars for managed_node2 30583 1726853796.71075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853796.71940: done with get_vars() 30583 1726853796.71954: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 13:36:36 -0400 (0:00:00.035) 0:02:12.057 ****** 30583 1726853796.72021: entering _queue_task() for managed_node2/include_tasks 30583 1726853796.72246: worker is 1 (out of 1 available) 30583 1726853796.72264: exiting _queue_task() for managed_node2/include_tasks 30583 1726853796.72278: done queuing things up, now waiting for results queue to drain 30583 1726853796.72280: waiting for pending results... 
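The debug task above (task path `tasks/run_test.yml:47`) templated `lsr_description` from include params and emitted the banner seen in `MSG:`. A sketch consistent with the logged output, assuming a plain `debug` task (the source file itself is not shown in this trace):

```yaml
# Hedged sketch of the "Success in test" task at run_test.yml:47;
# the msg text matches the MSG line printed in the trace.
- name: "Success in test '{{ lsr_description }}'"
  debug:
    msg: "+++++ Success in test '{{ lsr_description }}' +++++"
```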
30583 1726853796.72454: running TaskExecutor() for managed_node2/TASK: Cleanup 30583 1726853796.72533: in run() - task 02083763-bbaf-05ea-abc5-0000000020b8 30583 1726853796.72544: variable 'ansible_search_path' from source: unknown 30583 1726853796.72547: variable 'ansible_search_path' from source: unknown 30583 1726853796.72587: variable 'lsr_cleanup' from source: include params 30583 1726853796.72736: variable 'lsr_cleanup' from source: include params 30583 1726853796.72793: variable 'omit' from source: magic vars 30583 1726853796.72896: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853796.72905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853796.72914: variable 'omit' from source: magic vars 30583 1726853796.73086: variable 'ansible_distribution_major_version' from source: facts 30583 1726853796.73094: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853796.73099: variable 'item' from source: unknown 30583 1726853796.73145: variable 'item' from source: unknown 30583 1726853796.73173: variable 'item' from source: unknown 30583 1726853796.73213: variable 'item' from source: unknown 30583 1726853796.73347: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853796.73350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853796.73352: variable 'omit' from source: magic vars 30583 1726853796.73421: variable 'ansible_distribution_major_version' from source: facts 30583 1726853796.73425: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853796.73431: variable 'item' from source: unknown 30583 1726853796.73476: variable 'item' from source: unknown 30583 1726853796.73496: variable 'item' from source: unknown 30583 1726853796.73538: variable 'item' from source: unknown 30583 1726853796.73612: dumping result to json 30583 1726853796.73615: done dumping result, returning 30583 
1726853796.73617: done running TaskExecutor() for managed_node2/TASK: Cleanup [02083763-bbaf-05ea-abc5-0000000020b8] 30583 1726853796.73619: sending task result for task 02083763-bbaf-05ea-abc5-0000000020b8 30583 1726853796.73651: done sending task result for task 02083763-bbaf-05ea-abc5-0000000020b8 30583 1726853796.73654: WORKER PROCESS EXITING 30583 1726853796.73736: no more pending results, returning what we have 30583 1726853796.73741: in VariableManager get_vars() 30583 1726853796.73780: Calling all_inventory to load vars for managed_node2 30583 1726853796.73783: Calling groups_inventory to load vars for managed_node2 30583 1726853796.73785: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853796.73795: Calling all_plugins_play to load vars for managed_node2 30583 1726853796.73798: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853796.73800: Calling groups_plugins_play to load vars for managed_node2 30583 1726853796.74698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853796.75536: done with get_vars() 30583 1726853796.75550: variable 'ansible_search_path' from source: unknown 30583 1726853796.75552: variable 'ansible_search_path' from source: unknown 30583 1726853796.75581: variable 'ansible_search_path' from source: unknown 30583 1726853796.75582: variable 'ansible_search_path' from source: unknown 30583 1726853796.75599: we have included files to process 30583 1726853796.75600: generating all_blocks data 30583 1726853796.75601: done generating all_blocks data 30583 1726853796.75603: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30583 1726853796.75604: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30583 1726853796.75605: Loading data from 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30583 1726853796.75730: done processing included file 30583 1726853796.75732: iterating over new_blocks loaded from include file 30583 1726853796.75732: in VariableManager get_vars() 30583 1726853796.75743: done with get_vars() 30583 1726853796.75744: filtering new block on tags 30583 1726853796.75763: done filtering new block on tags 30583 1726853796.75765: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node2 => (item=tasks/cleanup_profile+device.yml) 30583 1726853796.75768: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 30583 1726853796.75769: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 30583 1726853796.75773: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 30583 1726853796.75991: done processing included file 30583 1726853796.75992: iterating over new_blocks loaded from include file 30583 1726853796.75993: in VariableManager get_vars() 30583 1726853796.76003: done with get_vars() 30583 1726853796.76004: filtering new block on tags 30583 1726853796.76023: done filtering new block on tags 30583 1726853796.76024: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node2 => (item=tasks/check_network_dns.yml) 30583 1726853796.76026: extending task lists for all hosts with included blocks 30583 1726853796.76924: done extending task lists 30583 1726853796.76925: done 
processing included files 30583 1726853796.76925: results queue empty 30583 1726853796.76926: checking for any_errors_fatal 30583 1726853796.76928: done checking for any_errors_fatal 30583 1726853796.76928: checking for max_fail_percentage 30583 1726853796.76929: done checking for max_fail_percentage 30583 1726853796.76930: checking to see if all hosts have failed and the running result is not ok 30583 1726853796.76930: done checking to see if all hosts have failed 30583 1726853796.76931: getting the remaining hosts for this loop 30583 1726853796.76932: done getting the remaining hosts for this loop 30583 1726853796.76933: getting the next task for host managed_node2 30583 1726853796.76936: done getting next task for host managed_node2 30583 1726853796.76938: ^ task is: TASK: Cleanup profile and device 30583 1726853796.76939: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853796.76941: getting variables 30583 1726853796.76941: in VariableManager get_vars() 30583 1726853796.76950: Calling all_inventory to load vars for managed_node2 30583 1726853796.76956: Calling groups_inventory to load vars for managed_node2 30583 1726853796.76960: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853796.76964: Calling all_plugins_play to load vars for managed_node2 30583 1726853796.76965: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853796.76967: Calling groups_plugins_play to load vars for managed_node2 30583 1726853796.77578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853796.78417: done with get_vars() 30583 1726853796.78431: done getting variables 30583 1726853796.78461: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 13:36:36 -0400 (0:00:00.064) 0:02:12.122 ****** 30583 1726853796.78483: entering _queue_task() for managed_node2/shell 30583 1726853796.78732: worker is 1 (out of 1 available) 30583 1726853796.78749: exiting _queue_task() for managed_node2/shell 30583 1726853796.78765: done queuing things up, now waiting for results queue to drain 30583 1726853796.78766: waiting for pending results... 
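The Cleanup task above (task path `tasks/run_test.yml:66`) is an `include_tasks` that iterates over `lsr_cleanup`; the trace shows it resolving two items, `tasks/cleanup_profile+device.yml` and `tasks/check_network_dns.yml`, and extending the task lists with the included blocks. A plausible shape for that task, assuming a simple loop over the include params (the actual file is not reproduced in this trace):

```yaml
# Hedged sketch of the Cleanup include at run_test.yml:66; item values
# observed in the trace were tasks/cleanup_profile+device.yml and
# tasks/check_network_dns.yml.
- name: Cleanup
  include_tasks: "{{ item }}"
  loop: "{{ lsr_cleanup }}"
```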
30583 1726853796.78957: running TaskExecutor() for managed_node2/TASK: Cleanup profile and device 30583 1726853796.79038: in run() - task 02083763-bbaf-05ea-abc5-00000000299e 30583 1726853796.79050: variable 'ansible_search_path' from source: unknown 30583 1726853796.79055: variable 'ansible_search_path' from source: unknown 30583 1726853796.79087: calling self._execute() 30583 1726853796.79162: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853796.79169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853796.79180: variable 'omit' from source: magic vars 30583 1726853796.79460: variable 'ansible_distribution_major_version' from source: facts 30583 1726853796.79474: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853796.79480: variable 'omit' from source: magic vars 30583 1726853796.79512: variable 'omit' from source: magic vars 30583 1726853796.79625: variable 'interface' from source: play vars 30583 1726853796.79641: variable 'omit' from source: magic vars 30583 1726853796.79680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853796.79707: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853796.79724: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853796.79737: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853796.79749: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853796.79778: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853796.79782: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853796.79784: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853796.79858: Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853796.79868: Set connection var ansible_timeout to 10 30583 1726853796.79873: Set connection var ansible_connection to ssh 30583 1726853796.79875: Set connection var ansible_shell_executable to /bin/sh 30583 1726853796.79878: Set connection var ansible_shell_type to sh 30583 1726853796.79886: Set connection var ansible_pipelining to False 30583 1726853796.79903: variable 'ansible_shell_executable' from source: unknown 30583 1726853796.79907: variable 'ansible_connection' from source: unknown 30583 1726853796.79909: variable 'ansible_module_compression' from source: unknown 30583 1726853796.79911: variable 'ansible_shell_type' from source: unknown 30583 1726853796.79913: variable 'ansible_shell_executable' from source: unknown 30583 1726853796.79915: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853796.79918: variable 'ansible_pipelining' from source: unknown 30583 1726853796.79920: variable 'ansible_timeout' from source: unknown 30583 1726853796.79925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853796.80026: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853796.80036: variable 'omit' from source: magic vars 30583 1726853796.80040: starting attempt loop 30583 1726853796.80043: running the handler 30583 1726853796.80052: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853796.80073: _low_level_execute_command(): starting 30583 1726853796.80076: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853796.80595: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853796.80598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853796.80601: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853796.80603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853796.80656: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853796.80664: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853796.80667: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853796.80746: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853796.82518: stdout chunk (state=3): >>>/root <<< 30583 1726853796.82619: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853796.82648: stderr chunk (state=3): >>><<< 30583 1726853796.82652: stdout chunk (state=3): >>><<< 30583 1726853796.82674: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853796.82687: _low_level_execute_command(): starting 30583 1726853796.82692: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853796.8267365-36113-129566159688580 `" && echo ansible-tmp-1726853796.8267365-36113-129566159688580="` echo /root/.ansible/tmp/ansible-tmp-1726853796.8267365-36113-129566159688580 `" ) && sleep 0' 30583 1726853796.83117: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 30583 1726853796.83128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853796.83130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853796.83133: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853796.83135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853796.83181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853796.83187: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853796.83261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853796.85318: stdout chunk (state=3): >>>ansible-tmp-1726853796.8267365-36113-129566159688580=/root/.ansible/tmp/ansible-tmp-1726853796.8267365-36113-129566159688580 <<< 30583 1726853796.85681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853796.85685: stdout chunk (state=3): >>><<< 30583 1726853796.85687: stderr chunk (state=3): >>><<< 30583 1726853796.85691: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853796.8267365-36113-129566159688580=/root/.ansible/tmp/ansible-tmp-1726853796.8267365-36113-129566159688580 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853796.85693: variable 'ansible_module_compression' from source: unknown 30583 1726853796.85725: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30583 1726853796.85769: variable 'ansible_facts' from source: unknown 30583 1726853796.85865: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853796.8267365-36113-129566159688580/AnsiballZ_command.py 30583 1726853796.86097: Sending initial data 30583 1726853796.86101: Sent initial data (156 bytes) 30583 1726853796.86622: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30583 1726853796.86636: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853796.86690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853796.86757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853796.86784: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853796.86848: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853796.88551: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 
<<< 30583 1726853796.88555: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853796.88620: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853796.88696: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp2purr1k5 /root/.ansible/tmp/ansible-tmp-1726853796.8267365-36113-129566159688580/AnsiballZ_command.py <<< 30583 1726853796.88699: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853796.8267365-36113-129566159688580/AnsiballZ_command.py" <<< 30583 1726853796.88758: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp2purr1k5" to remote "/root/.ansible/tmp/ansible-tmp-1726853796.8267365-36113-129566159688580/AnsiballZ_command.py" <<< 30583 1726853796.88765: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853796.8267365-36113-129566159688580/AnsiballZ_command.py" <<< 30583 1726853796.89426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853796.89469: stderr chunk (state=3): >>><<< 30583 1726853796.89474: stdout chunk (state=3): >>><<< 30583 1726853796.89493: done transferring module to remote 30583 1726853796.89502: _low_level_execute_command(): starting 30583 1726853796.89507: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853796.8267365-36113-129566159688580/ /root/.ansible/tmp/ansible-tmp-1726853796.8267365-36113-129566159688580/AnsiballZ_command.py && sleep 0' 30583 1726853796.89927: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853796.89931: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853796.89933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853796.89938: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853796.89941: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853796.89977: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853796.89996: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853796.90063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853796.91999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853796.92024: stderr chunk (state=3): >>><<< 30583 1726853796.92027: stdout chunk (state=3): >>><<< 30583 1726853796.92038: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853796.92041: _low_level_execute_command(): starting 30583 1726853796.92046: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853796.8267365-36113-129566159688580/AnsiballZ_command.py && sleep 0' 30583 1726853796.92463: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853796.92467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853796.92469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853796.92472: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853796.92475: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853796.92522: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853796.92525: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853796.92617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853797.11777: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCould not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 13:36:37.084310", "end": "2024-09-20 13:36:37.116641", "delta": "0:00:00.032331", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30583 1726853797.13400: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853797.13427: stderr chunk (state=3): >>><<< 30583 1726853797.13430: stdout chunk (state=3): >>><<< 30583 1726853797.13452: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCould not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 13:36:37.084310", "end": "2024-09-20 13:36:37.116641", "delta": "0:00:00.032331", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.197 closed. 30583 1726853797.13486: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853796.8267365-36113-129566159688580/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853797.13495: _low_level_execute_command(): starting 30583 1726853797.13500: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853796.8267365-36113-129566159688580/ > /dev/null 2>&1 && sleep 0' 30583 1726853797.13945: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853797.13948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 
10.31.9.197 debug2: match not found <<< 30583 1726853797.13950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853797.13952: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853797.13954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853797.14008: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853797.14012: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853797.14017: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853797.14091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853797.15996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853797.16022: stderr chunk (state=3): >>><<< 30583 1726853797.16025: stdout chunk (state=3): >>><<< 30583 1726853797.16039: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853797.16044: handler run complete 30583 1726853797.16063: Evaluated conditional (False): False 30583 1726853797.16073: attempt loop complete, returning result 30583 1726853797.16076: _execute() done 30583 1726853797.16078: dumping result to json 30583 1726853797.16084: done dumping result, returning 30583 1726853797.16092: done running TaskExecutor() for managed_node2/TASK: Cleanup profile and device [02083763-bbaf-05ea-abc5-00000000299e] 30583 1726853797.16096: sending task result for task 02083763-bbaf-05ea-abc5-00000000299e 30583 1726853797.16190: done sending task result for task 02083763-bbaf-05ea-abc5-00000000299e 30583 1726853797.16192: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.032331", "end": "2024-09-20 13:36:37.116641", "rc": 1, "start": "2024-09-20 13:36:37.084310" } STDERR: Error: unknown connection 'statebr'. Error: cannot delete unknown connection(s): 'statebr'. 
Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' Cannot find device "statebr" MSG: non-zero return code ...ignoring 30583 1726853797.16256: no more pending results, returning what we have 30583 1726853797.16261: results queue empty 30583 1726853797.16262: checking for any_errors_fatal 30583 1726853797.16263: done checking for any_errors_fatal 30583 1726853797.16264: checking for max_fail_percentage 30583 1726853797.16266: done checking for max_fail_percentage 30583 1726853797.16267: checking to see if all hosts have failed and the running result is not ok 30583 1726853797.16267: done checking to see if all hosts have failed 30583 1726853797.16268: getting the remaining hosts for this loop 30583 1726853797.16270: done getting the remaining hosts for this loop 30583 1726853797.16275: getting the next task for host managed_node2 30583 1726853797.16285: done getting next task for host managed_node2 30583 1726853797.16289: ^ task is: TASK: Check routes and DNS 30583 1726853797.16294: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30583 1726853797.16298: getting variables 30583 1726853797.16300: in VariableManager get_vars() 30583 1726853797.16347: Calling all_inventory to load vars for managed_node2 30583 1726853797.16350: Calling groups_inventory to load vars for managed_node2 30583 1726853797.16353: Calling all_plugins_inventory to load vars for managed_node2 30583 1726853797.16364: Calling all_plugins_play to load vars for managed_node2 30583 1726853797.16367: Calling groups_plugins_inventory to load vars for managed_node2 30583 1726853797.16369: Calling groups_plugins_play to load vars for managed_node2 30583 1726853797.17301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30583 1726853797.18163: done with get_vars() 30583 1726853797.18181: done getting variables 30583 1726853797.18223: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 13:36:37 -0400 (0:00:00.397) 0:02:12.519 ****** 30583 1726853797.18246: entering _queue_task() for managed_node2/shell 30583 1726853797.18491: worker is 1 (out of 1 available) 30583 1726853797.18505: exiting _queue_task() for managed_node2/shell 30583 1726853797.18517: done queuing things up, now waiting for results queue to drain 30583 1726853797.18518: waiting for pending results... 
30583 1726853797.18716: running TaskExecutor() for managed_node2/TASK: Check routes and DNS 30583 1726853797.18789: in run() - task 02083763-bbaf-05ea-abc5-0000000029a2 30583 1726853797.18801: variable 'ansible_search_path' from source: unknown 30583 1726853797.18805: variable 'ansible_search_path' from source: unknown 30583 1726853797.18833: calling self._execute() 30583 1726853797.18915: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853797.18919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853797.18928: variable 'omit' from source: magic vars 30583 1726853797.19211: variable 'ansible_distribution_major_version' from source: facts 30583 1726853797.19221: Evaluated conditional (ansible_distribution_major_version != '6'): True 30583 1726853797.19227: variable 'omit' from source: magic vars 30583 1726853797.19258: variable 'omit' from source: magic vars 30583 1726853797.19289: variable 'omit' from source: magic vars 30583 1726853797.19322: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30583 1726853797.19350: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30583 1726853797.19373: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30583 1726853797.19387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853797.19399: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30583 1726853797.19425: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30583 1726853797.19429: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853797.19431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853797.19507: 
Set connection var ansible_module_compression to ZIP_DEFLATED 30583 1726853797.19510: Set connection var ansible_timeout to 10 30583 1726853797.19513: Set connection var ansible_connection to ssh 30583 1726853797.19515: Set connection var ansible_shell_executable to /bin/sh 30583 1726853797.19518: Set connection var ansible_shell_type to sh 30583 1726853797.19527: Set connection var ansible_pipelining to False 30583 1726853797.19544: variable 'ansible_shell_executable' from source: unknown 30583 1726853797.19547: variable 'ansible_connection' from source: unknown 30583 1726853797.19550: variable 'ansible_module_compression' from source: unknown 30583 1726853797.19552: variable 'ansible_shell_type' from source: unknown 30583 1726853797.19555: variable 'ansible_shell_executable' from source: unknown 30583 1726853797.19557: variable 'ansible_host' from source: host vars for 'managed_node2' 30583 1726853797.19563: variable 'ansible_pipelining' from source: unknown 30583 1726853797.19565: variable 'ansible_timeout' from source: unknown 30583 1726853797.19569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30583 1726853797.19675: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853797.19683: variable 'omit' from source: magic vars 30583 1726853797.19688: starting attempt loop 30583 1726853797.19691: running the handler 30583 1726853797.19700: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30583 1726853797.19718: 
_low_level_execute_command(): starting 30583 1726853797.19726: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30583 1726853797.20240: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853797.20244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853797.20247: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30583 1726853797.20250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853797.20304: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853797.20307: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853797.20313: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853797.20390: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853797.22162: stdout chunk (state=3): >>>/root <<< 30583 1726853797.22261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853797.22290: stderr chunk (state=3): >>><<< 30583 1726853797.22294: stdout chunk (state=3): >>><<< 30583 1726853797.22315: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30583 1726853797.22327: _low_level_execute_command(): starting
30583 1726853797.22332: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853797.223153-36124-47212428084616 `" && echo ansible-tmp-1726853797.223153-36124-47212428084616="` echo /root/.ansible/tmp/ansible-tmp-1726853797.223153-36124-47212428084616 `" ) && sleep 0'
30583 1726853797.22760: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30583 1726853797.22773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853797.22776: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
30583 1726853797.22778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<<
30583 1726853797.22780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853797.22824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<<
30583 1726853797.22827: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30583 1726853797.22831: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30583 1726853797.22904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30583 1726853797.24969: stdout chunk (state=3): >>>ansible-tmp-1726853797.223153-36124-47212428084616=/root/.ansible/tmp/ansible-tmp-1726853797.223153-36124-47212428084616 <<<
30583 1726853797.25084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30583 1726853797.25106: stderr chunk (state=3): >>><<<
30583 1726853797.25109: stdout chunk (state=3): >>><<<
30583 1726853797.25122: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853797.223153-36124-47212428084616=/root/.ansible/tmp/ansible-tmp-1726853797.223153-36124-47212428084616 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30583 1726853797.25152: variable 'ansible_module_compression' from source: unknown
30583 1726853797.25193: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED
30583 1726853797.25223: variable 'ansible_facts' from source: unknown
30583 1726853797.25283: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853797.223153-36124-47212428084616/AnsiballZ_command.py
30583 1726853797.25384: Sending initial data
30583 1726853797.25388: Sent initial data (154 bytes)
30583 1726853797.25808: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
30583 1726853797.25813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<<
30583 1726853797.25816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853797.25818: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30583 1726853797.25820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853797.25876: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<<
30583 1726853797.25879: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30583 1726853797.25883: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30583 1726853797.25955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30583 1726853797.27661: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension
"users-groups-by-id@openssh.com" revision 1 <<<
30583 1726853797.27728: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
30583 1726853797.27798: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpz7lvgu4e /root/.ansible/tmp/ansible-tmp-1726853797.223153-36124-47212428084616/AnsiballZ_command.py <<<
30583 1726853797.27802: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853797.223153-36124-47212428084616/AnsiballZ_command.py" <<<
30583 1726853797.27868: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmpz7lvgu4e" to remote "/root/.ansible/tmp/ansible-tmp-1726853797.223153-36124-47212428084616/AnsiballZ_command.py" <<<
30583 1726853797.27874: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853797.223153-36124-47212428084616/AnsiballZ_command.py" <<<
30583 1726853797.28533: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30583 1726853797.28575: stderr chunk (state=3): >>><<<
30583 1726853797.28578: stdout chunk (state=3): >>><<<
30583 1726853797.28613: done transferring module to remote
30583 1726853797.28621: _low_level_execute_command(): starting
30583 1726853797.28627: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853797.223153-36124-47212428084616/ /root/.ansible/tmp/ansible-tmp-1726853797.223153-36124-47212428084616/AnsiballZ_command.py && sleep 0'
30583 1726853797.29054: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30583 1726853797.29058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<<
30583 1726853797.29066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853797.29068: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30583 1726853797.29072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853797.29123: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<<
30583 1726853797.29128: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30583 1726853797.29130: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30583 1726853797.29198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30583 1726853797.31104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30583 1726853797.31129: stderr chunk (state=3): >>><<<
30583 1726853797.31132: stdout chunk (state=3): >>><<<
30583 1726853797.31143: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30583 1726853797.31146: _low_level_execute_command(): starting
30583 1726853797.31151: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853797.223153-36124-47212428084616/AnsiballZ_command.py && sleep 0'
30583 1726853797.31556: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30583 1726853797.31561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<<
30583 1726853797.31564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<<
30583 1726853797.31566: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30583 1726853797.31568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host
10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853797.31612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<<
30583 1726853797.31615: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
30583 1726853797.31699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30583 1726853797.48675: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:bc:da:29:a4:45 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.197/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2947sec preferred_lft 2947sec\n inet6 fe80::10bc:daff:fe29:a445/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.197 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.197 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:36:37.476184", "end": "2024-09-20 13:36:37.485374", "delta": "0:00:00.009190", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<<
30583 1726853797.50395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. <<<
30583 1726853797.50425: stderr chunk (state=3): >>><<<
30583 1726853797.50428: stdout chunk (state=3): >>><<<
30583 1726853797.50444: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:bc:da:29:a4:45 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.197/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2947sec preferred_lft 2947sec\n inet6 fe80::10bc:daff:fe29:a445/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.197 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.197 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:36:37.476184", "end": "2024-09-20 13:36:37.485374", "delta": "0:00:00.009190", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed.
30583 1726853797.50486: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853797.223153-36124-47212428084616/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
30583 1726853797.50492: _low_level_execute_command(): starting
30583 1726853797.50497: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853797.223153-36124-47212428084616/ > /dev/null 2>&1 && sleep 0'
30583 1726853797.50949: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30583 1726853797.50953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853797.50955: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
30583 1726853797.50957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853797.51012: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<<
30583 1726853797.51016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30583 1726853797.51020: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30583 1726853797.51094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30583 1726853797.53020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30583 1726853797.53045: stderr chunk (state=3): >>><<<
30583 1726853797.53048: stdout chunk (state=3): >>><<<
30583 1726853797.53066: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30583 1726853797.53074: handler run complete
30583 1726853797.53093: Evaluated conditional (False): False
30583 1726853797.53105: attempt loop complete, returning result
30583 1726853797.53107: _execute() done
30583 1726853797.53110: dumping result to json
30583 1726853797.53115: done dumping result, returning
30583 1726853797.53123: done running TaskExecutor() for managed_node2/TASK: Check routes and DNS [02083763-bbaf-05ea-abc5-0000000029a2]
30583 1726853797.53128: sending task result for task 02083763-bbaf-05ea-abc5-0000000029a2
30583 1726853797.53229: done sending task result for task 02083763-bbaf-05ea-abc5-0000000029a2
30583 1726853797.53232: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n",
    "delta": "0:00:00.009190",
    "end": "2024-09-20 13:36:37.485374",
    "rc": 0,
    "start": "2024-09-20 13:36:37.476184"
}

STDOUT:

IP
1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
 inet 127.0.0.1/8 scope host lo
 valid_lft forever preferred_lft forever
 inet6 ::1/128 scope host noprefixroute 
 valid_lft forever preferred_lft forever
2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000
 link/ether 12:bc:da:29:a4:45 brd ff:ff:ff:ff:ff:ff
 altname enX0
 inet 10.31.9.197/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0
 valid_lft 2947sec preferred_lft 2947sec
 inet6 fe80::10bc:daff:fe29:a445/64 scope link noprefixroute 
 valid_lft forever preferred_lft forever
IP ROUTE
default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.197 metric 100 
10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.197 metric 100 
IP -6 ROUTE
fe80::/64 dev eth0 proto kernel metric 1024 pref medium
RESOLV
# Generated by NetworkManager
search us-east-1.aws.redhat.com
nameserver 10.29.169.13
nameserver 10.29.170.12
nameserver 10.2.32.1
30583 1726853797.53299: no more pending results, returning what we have
30583 1726853797.53303: results queue empty
30583 1726853797.53304: checking for any_errors_fatal
30583 1726853797.53318: done checking for any_errors_fatal
30583 1726853797.53319: checking for max_fail_percentage
30583 1726853797.53320: done checking for max_fail_percentage
30583 1726853797.53321: checking to see if all hosts have failed and the running result is not ok
30583 1726853797.53322: done checking to see if all hosts have failed
30583 1726853797.53323: getting the remaining hosts for this loop
30583 1726853797.53324: done getting the remaining hosts for this loop
30583 1726853797.53328: getting the next task for host managed_node2
30583 1726853797.53336: done getting next task for host managed_node2
30583 1726853797.53339: ^ task is: TASK: Verify DNS and network connectivity
30583 1726853797.53347: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue?
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853797.53355: getting variables
30583 1726853797.53357: in VariableManager get_vars()
30583 1726853797.53402: Calling all_inventory to load vars for managed_node2
30583 1726853797.53405: Calling groups_inventory to load vars for managed_node2
30583 1726853797.53408: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853797.53418: Calling all_plugins_play to load vars for managed_node2
30583 1726853797.53421: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853797.53423: Calling groups_plugins_play to load vars for managed_node2
30583 1726853797.54246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853797.55226: done with get_vars()
30583 1726853797.55242: done getting variables
30583 1726853797.55287: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Verify DNS and network connectivity] *************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
Friday 20 September 2024  13:36:37 -0400 (0:00:00.370)       0:02:12.890 ****** 
30583 1726853797.55311: entering _queue_task() for managed_node2/shell
30583 1726853797.55544: worker is 1 (out of 1 available)
30583 1726853797.55558: exiting _queue_task() for managed_node2/shell
30583 1726853797.55572: done queuing things up, now waiting for results queue to drain
30583 1726853797.55574: waiting for pending results...
30583 1726853797.55763: running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity
30583 1726853797.55840: in run() - task 02083763-bbaf-05ea-abc5-0000000029a3
30583 1726853797.55853: variable 'ansible_search_path' from source: unknown
30583 1726853797.55856: variable 'ansible_search_path' from source: unknown
30583 1726853797.55890: calling self._execute()
30583 1726853797.55973: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853797.55977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853797.55986: variable 'omit' from source: magic vars
30583 1726853797.56273: variable 'ansible_distribution_major_version' from source: facts
30583 1726853797.56283: Evaluated conditional (ansible_distribution_major_version != '6'): True
30583 1726853797.56382: variable 'ansible_facts' from source: unknown
30583 1726853797.56837: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True
30583 1726853797.56842: variable 'omit' from source: magic vars
30583 1726853797.56877: variable 'omit' from source: magic vars
30583 1726853797.56901: variable 'omit' from source: magic vars
30583 1726853797.56934: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30583 1726853797.56964: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30583 1726853797.56983: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30583 1726853797.56999: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30583 1726853797.57009: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30583 1726853797.57032: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30583 1726853797.57036: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853797.57039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853797.57121: Set connection var ansible_module_compression to ZIP_DEFLATED
30583 1726853797.57126: Set connection var ansible_timeout to 10
30583 1726853797.57129: Set connection var ansible_connection to ssh
30583 1726853797.57134: Set connection var ansible_shell_executable to /bin/sh
30583 1726853797.57137: Set connection var ansible_shell_type to sh
30583 1726853797.57144: Set connection var ansible_pipelining to False
30583 1726853797.57164: variable 'ansible_shell_executable' from source: unknown
30583 1726853797.57167: variable 'ansible_connection' from source: unknown
30583 1726853797.57170: variable 'ansible_module_compression' from source: unknown
30583 1726853797.57174: variable 'ansible_shell_type' from source: unknown
30583 1726853797.57177: variable 'ansible_shell_executable' from source: unknown
30583 1726853797.57179: variable 'ansible_host' from source: host vars for 'managed_node2'
30583 1726853797.57181: variable 'ansible_pipelining' from source: unknown
30583 1726853797.57183: variable 'ansible_timeout' from source: unknown
30583 1726853797.57188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30583 1726853797.57292: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30583 1726853797.57303: variable 'omit' from source: magic vars
30583 1726853797.57308: starting attempt loop
30583 1726853797.57310: running the handler
30583 1726853797.57322: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30583 1726853797.57336: _low_level_execute_command(): starting
30583 1726853797.57342: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
30583 1726853797.57855: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30583 1726853797.57864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<<
30583 1726853797.57867: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<<
30583 1726853797.57870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853797.57917: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<<
30583 1726853797.57920: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30583 1726853797.57922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30583 1726853797.58006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30583 1726853797.59738: stdout chunk (state=3): >>>/root <<<
30583 1726853797.59840: stderr chunk
(state=3): >>>debug2: Received exit status from master 0 <<<
30583 1726853797.59869: stderr chunk (state=3): >>><<<
30583 1726853797.59875: stdout chunk (state=3): >>><<<
30583 1726853797.59895: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30583 1726853797.59908: _low_level_execute_command(): starting
30583 1726853797.59911: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853797.5989385-36132-57538685905486 `" && echo ansible-tmp-1726853797.5989385-36132-57538685905486="` echo /root/.ansible/tmp/ansible-tmp-1726853797.5989385-36132-57538685905486 `" ) && sleep 0'
30583 1726853797.60344: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30583 1726853797.60348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853797.60350: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
30583 1726853797.60353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<<
30583 1726853797.60357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30583 1726853797.60408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<<
30583 1726853797.60412: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30583 1726853797.60417: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30583 1726853797.60491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30583 1726853797.62510: stdout chunk (state=3): >>>ansible-tmp-1726853797.5989385-36132-57538685905486=/root/.ansible/tmp/ansible-tmp-1726853797.5989385-36132-57538685905486 <<<
30583 1726853797.62619: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30583 1726853797.62644: stderr chunk (state=3): >>><<<
30583 1726853797.62647: stdout chunk (state=3): >>><<<
30583 1726853797.62660: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853797.5989385-36132-57538685905486=/root/.ansible/tmp/ansible-tmp-1726853797.5989385-36132-57538685905486 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30583 1726853797.62689: variable 'ansible_module_compression' from source: unknown
30583 1726853797.62732: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30583c3ru6b16/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED
30583 1726853797.62761: variable 'ansible_facts' from source: unknown
30583 1726853797.62820: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853797.5989385-36132-57538685905486/AnsiballZ_command.py
30583 1726853797.62915: Sending initial data
30583 1726853797.62919: Sent initial data (155 bytes)
30583 1726853797.63344: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853797.63347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853797.63349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853797.63352: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853797.63354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853797.63409: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853797.63415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853797.63417: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853797.63487: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853797.65192: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server 
supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30583 1726853797.65197: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30583 1726853797.65261: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30583 1726853797.65332: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp60hb3caq /root/.ansible/tmp/ansible-tmp-1726853797.5989385-36132-57538685905486/AnsiballZ_command.py <<< 30583 1726853797.65336: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853797.5989385-36132-57538685905486/AnsiballZ_command.py" <<< 30583 1726853797.65399: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30583c3ru6b16/tmp60hb3caq" to remote "/root/.ansible/tmp/ansible-tmp-1726853797.5989385-36132-57538685905486/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853797.5989385-36132-57538685905486/AnsiballZ_command.py" <<< 30583 1726853797.66034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853797.66074: stderr chunk (state=3): >>><<< 30583 1726853797.66077: stdout chunk (state=3): >>><<< 30583 1726853797.66105: done transferring module to remote 30583 1726853797.66113: _low_level_execute_command(): starting 30583 1726853797.66117: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853797.5989385-36132-57538685905486/ /root/.ansible/tmp/ansible-tmp-1726853797.5989385-36132-57538685905486/AnsiballZ_command.py && sleep 0' 30583 1726853797.66535: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853797.66542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853797.66545: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853797.66547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found <<< 30583 1726853797.66549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853797.66593: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853797.66597: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853797.66676: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853797.68550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853797.68576: stderr chunk (state=3): >>><<< 30583 1726853797.68579: stdout chunk (state=3): >>><<< 30583 1726853797.68593: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853797.68597: _low_level_execute_command(): starting 30583 1726853797.68600: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853797.5989385-36132-57538685905486/AnsiballZ_command.py && sleep 0' 30583 1726853797.69021: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853797.69024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853797.69026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30583 1726853797.69028: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 30583 1726853797.69030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853797.69077: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853797.69092: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853797.69095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853797.69166: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853798.05629: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org 
mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 6745 0 --:--:-- --:--:-- --:--:-- 6777\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2538 0 --:--:-- --:--:-- --:--:-- 2552", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:36:37.850390", "end": "2024-09-20 13:36:38.054993", "delta": "0:00:00.204603", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30583 1726853798.07345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 
<<< 30583 1726853798.07376: stderr chunk (state=3): >>><<< 30583 1726853798.07379: stdout chunk (state=3): >>><<< 30583 1726853798.07399: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 6745 0 --:--:-- --:--:-- --:--:-- 6777\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2538 0 --:--:-- --:--:-- --:--:-- 2552", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org 
mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:36:37.850390", "end": "2024-09-20 13:36:38.054993", "delta": "0:00:00.204603", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.197 closed. 30583 1726853798.07438: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853797.5989385-36132-57538685905486/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30583 1726853798.07445: _low_level_execute_command(): starting 30583 1726853798.07450: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853797.5989385-36132-57538685905486/ > /dev/null 2>&1 && sleep 0' 30583 1726853798.07906: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30583 1726853798.07909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found <<< 30583 1726853798.07912: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853798.07914: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30583 1726853798.07916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30583 1726853798.07976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' <<< 30583 1726853798.07979: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30583 1726853798.07980: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30583 1726853798.08043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30583 1726853798.09977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30583 1726853798.10005: stderr chunk (state=3): >>><<< 30583 1726853798.10008: stdout chunk (state=3): >>><<< 30583 1726853798.10021: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.197 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.197 originally 10.31.9.197 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/429203141d' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30583 1726853798.10028: handler run complete 30583 1726853798.10046: Evaluated conditional (False): False 30583 1726853798.10056: attempt loop complete, returning result 30583 1726853798.10062: _execute() done 30583 1726853798.10064: dumping result to json 30583 1726853798.10066: done dumping result, returning 30583 1726853798.10073: done running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity [02083763-bbaf-05ea-abc5-0000000029a3] 30583 1726853798.10078: sending task result for task 02083763-bbaf-05ea-abc5-0000000029a3 30583 1726853798.10187: done sending task result for task 02083763-bbaf-05ea-abc5-0000000029a3 30583 1726853798.10190: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.204603", "end": "2024-09-20 13:36:38.054993", "rc": 0, "start": "2024-09-20 13:36:37.850390" } STDOUT: CHECK DNS AND CONNECTIVITY 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 6745 0 --:--:-- --:--:-- --:--:-- 6777 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 2538 0 --:--:-- --:--:-- --:--:-- 2552 30583 1726853798.10255: no more pending results, returning what we have 30583 1726853798.10261: results queue empty 30583 1726853798.10262: 
checking for any_errors_fatal 30583 1726853798.10276: done checking for any_errors_fatal 30583 1726853798.10277: checking for max_fail_percentage 30583 1726853798.10280: done checking for max_fail_percentage 30583 1726853798.10281: checking to see if all hosts have failed and the running result is not ok 30583 1726853798.10281: done checking to see if all hosts have failed 30583 1726853798.10282: getting the remaining hosts for this loop 30583 1726853798.10284: done getting the remaining hosts for this loop 30583 1726853798.10292: getting the next task for host managed_node2 30583 1726853798.10303: done getting next task for host managed_node2 30583 1726853798.10305: ^ task is: TASK: meta (flush_handlers) 30583 1726853798.10307: ^ state is: HOST STATE: block=9, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
30583 1726853798.10312: getting variables
30583 1726853798.10314: in VariableManager get_vars()
30583 1726853798.10356: Calling all_inventory to load vars for managed_node2
30583 1726853798.10362: Calling groups_inventory to load vars for managed_node2
30583 1726853798.10365: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853798.10380: Calling all_plugins_play to load vars for managed_node2
30583 1726853798.10382: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853798.10385: Calling groups_plugins_play to load vars for managed_node2
30583 1726853798.11224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853798.12085: done with get_vars()
30583 1726853798.12103: done getting variables
30583 1726853798.12151: in VariableManager get_vars()
30583 1726853798.12162: Calling all_inventory to load vars for managed_node2
30583 1726853798.12164: Calling groups_inventory to load vars for managed_node2
30583 1726853798.12165: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853798.12168: Calling all_plugins_play to load vars for managed_node2
30583 1726853798.12170: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853798.12173: Calling groups_plugins_play to load vars for managed_node2
30583 1726853798.12899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853798.13748: done with get_vars()
30583 1726853798.13769: done queuing things up, now waiting for results queue to drain
30583 1726853798.13770: results queue empty
30583 1726853798.13772: checking for any_errors_fatal
30583 1726853798.13775: done checking for any_errors_fatal
30583 1726853798.13775: checking for max_fail_percentage
30583 1726853798.13776: done checking for max_fail_percentage
30583 1726853798.13776: checking to see if all hosts have failed and the running result is not ok
30583 1726853798.13777: done checking to see if all hosts have failed
30583 1726853798.13777: getting the remaining hosts for this loop
30583 1726853798.13778: done getting the remaining hosts for this loop
30583 1726853798.13780: getting the next task for host managed_node2
30583 1726853798.13782: done getting next task for host managed_node2
30583 1726853798.13784: ^ task is: TASK: meta (flush_handlers)
30583 1726853798.13785: ^ state is: HOST STATE: block=10, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853798.13786: getting variables
30583 1726853798.13787: in VariableManager get_vars()
30583 1726853798.13794: Calling all_inventory to load vars for managed_node2
30583 1726853798.13796: Calling groups_inventory to load vars for managed_node2
30583 1726853798.13797: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853798.13800: Calling all_plugins_play to load vars for managed_node2
30583 1726853798.13802: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853798.13803: Calling groups_plugins_play to load vars for managed_node2
30583 1726853798.14447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853798.15278: done with get_vars()
30583 1726853798.15291: done getting variables
30583 1726853798.15323: in VariableManager get_vars()
30583 1726853798.15331: Calling all_inventory to load vars for managed_node2
30583 1726853798.15332: Calling groups_inventory to load vars for managed_node2
30583 1726853798.15334: Calling all_plugins_inventory to load vars for managed_node2
30583 1726853798.15338: Calling all_plugins_play to load vars for managed_node2
30583 1726853798.15339: Calling groups_plugins_inventory to load vars for managed_node2
30583 1726853798.15341: Calling groups_plugins_play to load vars for managed_node2
30583 1726853798.16020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30583 1726853798.16869: done with get_vars()
30583 1726853798.16888: done queuing things up, now waiting for results queue to drain
30583 1726853798.16890: results queue empty
30583 1726853798.16891: checking for any_errors_fatal
30583 1726853798.16891: done checking for any_errors_fatal
30583 1726853798.16892: checking for max_fail_percentage
30583 1726853798.16892: done checking for max_fail_percentage
30583 1726853798.16893: checking to see if all hosts have failed and the running result is not ok
30583 1726853798.16893: done checking to see if all hosts have failed
30583 1726853798.16894: getting the remaining hosts for this loop
30583 1726853798.16894: done getting the remaining hosts for this loop
30583 1726853798.16896: getting the next task for host managed_node2
30583 1726853798.16899: done getting next task for host managed_node2
30583 1726853798.16899: ^ task is: None
30583 1726853798.16900: ^ state is: HOST STATE: block=11, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30583 1726853798.16901: done queuing things up, now waiting for results queue to drain
30583 1726853798.16901: results queue empty
30583 1726853798.16902: checking for any_errors_fatal
30583 1726853798.16902: done checking for any_errors_fatal
30583 1726853798.16903: checking for max_fail_percentage
30583 1726853798.16904: done checking for max_fail_percentage
30583 1726853798.16904: checking to see if all hosts have failed and the running result is not ok
30583 1726853798.16904: done checking to see if all hosts have failed
30583 1726853798.16906: getting the next task for host managed_node2
30583 1726853798.16907: done getting next task for host managed_node2
30583 1726853798.16908: ^ task is: None
30583 1726853798.16908: ^ state is: HOST STATE: block=11, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node2              : ok=334  changed=10  unreachable=0  failed=0  skipped=312  rescued=0  ignored=10

Friday 20 September 2024  13:36:38 -0400 (0:00:00.616)       0:02:13.507 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.09s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.08s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.08s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.02s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.00s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.97s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.96s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.93s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.92s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.89s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.87s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.86s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.85s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.85s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.84s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.70s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gathering Facts --------------------------------------------------------- 1.70s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:6
Gathering Facts --------------------------------------------------------- 1.14s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 1.08s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 0.96s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
30583 1726853798.17128: RUNNING CLEANUP